Converting a PostgreSQL table to pandas dataframe
As above, we can also convert a PostgreSQL table to a pandas DataFrame, this time using the read_sql_table() function. Here, let us read the loan_data table.
Syntax: pandas.read_sql_table(table_name, con=engine, columns=None)
Explanation:
- table_name – name of the SQL table to be read
- con – SQLAlchemy engine (or connection) for the database
- columns – list of columns to be read from the SQL table
Python3
# import necessary packages
import pandas as pd
import psycopg2
from sqlalchemy import create_engine

# establish connection with the database
engine = create_engine(
    "dialect+driver://username:password@hostname:portnumber/databasename"
)

# read the postgresql table
table_df = pd.read_sql_table(
    "loan_data",
    con=engine,
    columns=['Loan_ID', 'Gender', 'Married', 'Dependents',
             'Education', 'Self_Employed', 'ApplicantIncome',
             'CoapplicantIncome', 'LoanAmount', 'Loan_Amount_Term',
             'Credit_History', 'Property_Area', 'Loan_Status'],
)

# print the postgresql table loaded as a pandas dataframe
print(table_df)
Output:
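The same read-a-table-into-a-DataFrame pattern can be sketched in a self-contained way using SQLite from the standard library, so it runs without a PostgreSQL server. Note that read_sql_table() itself requires a SQLAlchemy engine; the more general read_sql() shown here also accepts a plain DB-API connection. The table name and columns mirror the loan_data example, and the inserted row is made-up sample data.

```python
import sqlite3
import pandas as pd

# in-memory SQLite database standing in for the PostgreSQL server
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE loan_data (Loan_ID TEXT, Gender TEXT, LoanAmount REAL)"
)
conn.execute("INSERT INTO loan_data VALUES ('LP001002', 'Male', 128.0)")

# read_sql() loads the query result as a pandas DataFrame
table_df = pd.read_sql("SELECT Loan_ID, Gender, LoanAmount FROM loan_data", conn)
print(table_df)
```

With a real PostgreSQL database, the sqlite3 connection would be replaced by the SQLAlchemy engine created earlier, and pd.read_sql_table("loan_data", con=engine) could be used directly.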
Connecting Pandas to a Database with SQLAlchemy
In this article, we will discuss how to connect pandas to a database and perform database operations using SQLAlchemy.
The first step is to establish a connection with your existing database, using the create_engine() function of SQLAlchemy.
Syntax:
from sqlalchemy import create_engine
engine = create_engine("dialect+driver://username:password@host:port/database")
Explanation:
- dialect – name of the DBMS (e.g. postgresql)
- driver – name of the DB API driver that moves data between SQLAlchemy and the database (e.g. psycopg2)
- username, password – database user credentials
- host:port – hostname (or IP address) and port number of the database server
- database – name of the database to connect to
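The parts above can be assembled into a connection URL as a short sketch. The PostgreSQL credentials below are placeholders, not a working server; a SQLite in-memory engine is used for the runnable part because it needs no database server, and create_engine() only connects lazily when the engine is first used.

```python
from sqlalchemy import create_engine
from sqlalchemy.engine import make_url

# PostgreSQL URL assembled from dialect, driver, credentials, host, port,
# and database name -- placeholder values for illustration only:
pg_url = "postgresql+psycopg2://user:secret@localhost:5432/mydb"
url = make_url(pg_url)
print(url.drivername, url.host, url.port, url.database)

# a runnable engine: SQLite in memory, no server required
engine = create_engine("sqlite:///:memory:")
print(engine.dialect.name)
```

The same engine object is what gets passed as the con argument to read_sql_table() in the example above.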