I made a call to request data from Weight Gurus, which returns its response as a Python dictionary. I need to take the data retrieved from this call and INSERT each entry as an individual row.
So far I have managed to get the data from Weight Gurus and establish a connection to my DB within Python, but I have had no luck iterating through the data to INSERT each entry as an individual row.
import requests

# Login and get the auth token
data = {"email": "", "password": ""}
login_response = requests.post("https://api.weightgurus.com/v3/account/login", data=data)
login_json = login_response.json()

# grab all your data
data_response = requests.get(
    "https://api.weightgurus.com/v3/operation/",
    headers={
        "Authorization": f'Bearer {login_json["accessToken"]}',
        "Accept": "application/json, text/plain, */*",
    },
)
scale_data_json = data_response.json()

for entry in scale_data_json["operations"]:
    print(entry)
import pyodbc

server = ''
database = ''
username = ''
password = ''
driver = '{ODBC Driver 13 for SQL Server}'

cnxn = pyodbc.connect(
    'DRIVER=' + driver + ';SERVER=' + server + ';PORT=1433;DATABASE='
    + database + ';UID=' + username + ';PWD=' + password
)
cursor = cnxn.cursor()
The dictionaries in question each contain 9 keys. Each key corresponds to a column in my table, BodyComposition, and each dictionary should become an individual row. My table also has an auto-increment ID field for the primary key, if that makes a difference.
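For reference, a table matching that description might look something like the sketch below. The column types are guesses, since the actual schema isn't shown; adjust them to whatever your real table uses.

```sql
-- Hypothetical schema; the question only names the columns, not their types
CREATE TABLE BodyComposition (
    ID INT IDENTITY(1,1) PRIMARY KEY,  -- the auto-increment key mentioned above
    BMI FLOAT,
    BodyFat FLOAT,
    Entrytimestamp DATETIME2,
    MuscleMass FLOAT,
    OperationType VARCHAR(50),
    ServerTimestamp DATETIME2,
    Source VARCHAR(100),
    Water FLOAT,
    Weight FLOAT
);
```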
Solution:
Consider unpacking your collection of dictionaries into key/value tuples and then parameterizing the values tuple in the loop. Assume the data structure below (a list of dictionaries):
scale_data_json["operations"] = [
    {'BMI': 0, 'BodyFat': 10,
     'Entrytimestamp': '2018-01-21T19:37:47.821Z',
     'MuscleMass': 50, 'OperationType': 'create',
     'ServerTimestamp': '2018-01-21T19:37:47.821Z',
     'Source': 'bluetooth scale',
     'Water': 37, 'Weight': 21},
    {'BMI': 0, 'BodyFat': 10,
     'Entrytimestamp': '2018-01-21T19:37:47.821Z',
     'MuscleMass': 50, 'OperationType': 'create',
     'ServerTimestamp': '2018-01-21T19:37:47.821Z',
     'Source': 'bluetooth scale',
     'Water': 37, 'Weight': 21},
    ...
]
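As a quick illustration of what the unpacking step does, here is `zip(*entry.items())` applied to a trimmed-down sample dictionary (the trimming is just for brevity):

```python
# A trimmed sample entry; the real ones have all 9 keys
entry = {"BMI": 0, "BodyFat": 10, "Weight": 21}

# zip(*entry.items()) splits the dict into two parallel tuples
keys, values = zip(*entry.items())
# keys   -> ("BMI", "BodyFat", "Weight")
# values -> (0, 10, 21)
```

On Python 3.7+ dicts preserve insertion order, so `values` comes out in the same order as the keys were added.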
Loop through each dictionary, unpack the values with zip and then bind them in cursor.execute:
# PREPARED STATEMENT
sql = """INSERT INTO BodyComposition (BMI, BodyFat, Entrytimestamp,
MuscleMass, OperationType, ServerTimestamp,
Source, Water, Weight)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)
"""
# LOOP, UNPACK, BIND PARAMS
for entry in scale_data_json["operations"]:
    # NOTE: this relies on each dict's key order matching the column
    # order in the INSERT (dicts keep insertion order on Python 3.7+)
    keys, values = zip(*entry.items())
    cursor.execute(sql, values)

cnxn.commit()
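If you'd rather not depend on the dicts' key order lining up with the column list, you can bind by explicit column name instead. The sketch below does that and batches the inserts with `cursor.executemany`; `to_rows` is a helper name introduced here for illustration, not part of the original code.

```python
# Fixed column order, matching the INSERT statement exactly
columns = ("BMI", "BodyFat", "Entrytimestamp", "MuscleMass",
           "OperationType", "ServerTimestamp", "Source", "Water", "Weight")

sql = "INSERT INTO BodyComposition ({}) VALUES ({})".format(
    ", ".join(columns), ", ".join("?" * len(columns)))

def to_rows(operations):
    """Build one parameter tuple per dict, in the fixed column order."""
    return [tuple(entry[col] for col in columns) for entry in operations]

# rows = to_rows(scale_data_json["operations"])
# cursor.executemany(sql, rows)  # one batch instead of a round trip per row
# cnxn.commit()
```

Looking up each value by name means a reordered or differently-built dict can't silently shift values into the wrong columns, and a missing key fails loudly with a `KeyError`.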