How to write SQL table data to a pandas DataFrame?

Follow this step-by-step tutorial to load a relational SQL table into a pandas DataFrame.

Step #1: Create your database table if it doesn’t exist

We will first import the sqlite3 and pandas libraries.

import sqlite3
import pandas as pd

If you call either package without first importing it into your script or Jupyter notebook, you will receive a NameError exception:

NameError: name 'sqlite3' is not defined

We will assume that we have already created a database named hrdb containing a single table named courses.
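If the hrdb database doesn't exist yet, it can be created directly with sqlite3. Here is a minimal sketch (the column names and sample rows match the output shown further down; the exact schema and types are assumptions):

```python
import sqlite3

# Create (or open) the database file and the Courses table
conn = sqlite3.connect('hrdb.db')
conn.execute(
    "CREATE TABLE IF NOT EXISTS Courses ("
    "Course_ID INTEGER PRIMARY KEY, Course_Name TEXT, Course_Price INTEGER)"
)
# INSERT OR IGNORE makes the script safe to re-run
conn.executemany(
    "INSERT OR IGNORE INTO Courses VALUES (?, ?, ?)",
    [(1, 'Python Data Visualization', 1500),
     (2, 'Python Data', 1200),
     (3, 'R Analysis', 1300),
     (4, 'Learning Excel', 1000)],
)
conn.commit()
conn.close()
```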

Step #2: Import SQL table to pandas

We will now use the pandas pd.read_sql_query function to connect to the SQLite database and load its records into a DataFrame:

my_conn = sqlite3.connect('hrdb.db')
df_courses = pd.read_sql_query("SELECT * FROM Courses", my_conn)
my_conn.close()

The table contents are now held in a DataFrame in memory.
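If you only need a subset of the table, pd.read_sql_query also accepts a params argument for parameterized queries, and contextlib.closing guarantees the connection is closed even if the query fails. A minimal sketch (the setup step recreates the table so the snippet runs on its own, and the 1300 price cutoff is just an illustration):

```python
from contextlib import closing
import sqlite3

import pandas as pd

# Setup so the sketch runs on its own; skip if hrdb.db already exists
with closing(sqlite3.connect('hrdb.db')) as conn:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS Courses ("
        "Course_ID INTEGER PRIMARY KEY, Course_Name TEXT, Course_Price INTEGER)"
    )
    conn.executemany(
        "INSERT OR IGNORE INTO Courses VALUES (?, ?, ?)",
        [(1, 'Python Data Visualization', 1500), (2, 'Python Data', 1200),
         (3, 'R Analysis', 1300), (4, 'Learning Excel', 1000)],
    )
    conn.commit()

# closing() closes the connection even if the query raises;
# params= keeps values out of the SQL string itself
with closing(sqlite3.connect('hrdb.db')) as conn:
    df_cheap = pd.read_sql_query(
        "SELECT * FROM Courses WHERE Course_Price <= ?",
        conn,
        params=(1300,),
    )
```

Passing values through params instead of formatting them into the SQL string avoids SQL injection and quoting mistakes.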

Step #3: Analyze and visualize your data

We can now easily look into our data:

df_courses.head()

This will return:

   Course_ID                Course_Name  Course_Price
0          1  Python Data Visualization          1500
1          2                Python Data          1200
2          3                 R Analysis          1300
3          4             Learning Excel          1000

We can also run some descriptive statistics:

df_courses['Course_Price'].describe()
count       4.0000
mean     1250.0000
std       208.1666
min      1000.0000
25%      1150.0000
50%      1250.0000
75%      1350.0000
max      1500.0000
Name: Course_Price, dtype: float64

We can also now create a simple bar plot with pandas:

df_courses.plot(x = 'Course_Name', y = 'Course_Price', kind='bar', colormap='viridis');

Here’s our bar chart:
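To keep the chart as an image file, the matplotlib Axes object that pandas returns exposes the underlying figure, which can be saved to disk. A self-contained sketch (the data is rebuilt in-memory, and the course_prices.png filename is just an illustration):

```python
import matplotlib
matplotlib.use('Agg')  # render off-screen; drop this line inside Jupyter
import pandas as pd

# Same courses data as above, rebuilt in-memory so the sketch runs on its own
df_courses = pd.DataFrame({
    'Course_Name': ['Python Data Visualization', 'Python Data',
                    'R Analysis', 'Learning Excel'],
    'Course_Price': [1500, 1200, 1300, 1000],
})

# pandas returns the matplotlib Axes, whose figure we can save to disk
ax = df_courses.plot(x='Course_Name', y='Course_Price', kind='bar', legend=False)
ax.set_ylabel('Course_Price')
ax.figure.tight_layout()
ax.figure.savefig('course_prices.png', dpi=150)
```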