Working script gives error when getting more lines from get-connector

Question:

I have written code to get data from a get-connector. When getting data for 15 days everything works fine, but when getting data for 30 days I get the following error:

Traceback (most recent call last):
  File "C:\source\repos\Cabman Get-All\Cabman Get-All\Cabman_Get_All.py", line 46, in <module>
    jsondata = json.loads(data.content)
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.2800.0_x64__qbz5n2kfra8p0\lib\json\__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.2800.0_x64__qbz5n2kfra8p0\lib\json\decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.2800.0_x64__qbz5n2kfra8p0\lib\json\decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
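
In general, a JSONDecodeError at line 1 column 1 (char 0) means the response body is not JSON at all, for example an empty body or an HTML/plain-text error page. A minimal diagnostic sketch, assuming the same url_data and acces_token used in the snippet below (the values shown here are placeholders), that prints what the server actually returned before parsing:

import requests

# Placeholder values; in the real script these come from earlier code.
url_data = "https://services.%%%.com/connectors/..."
acces_token = "%%%"

data = requests.get(
    url_data,
    headers={"Host": "services.%%%.com", "Authorization": "Bearer " + acces_token},
)

# Inspect the response before handing it to json.loads.
print(data.status_code)                   # e.g. 200 vs. 500/502 or a timeout page
print(data.headers.get("Content-Type"))   # should be application/json
print(data.content[:200])                 # first bytes of the raw body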

The code I use for writing the JSON data to a file looks like this:

data = requests.get(url_data, headers = {'Host': 'services.%%%.com', 'Authorization': 'Bearer ' + acces_token})
status = data.status_code
jsondata = json.loads(data.content)
if status == 200:
    desired_dir = "B:\\Cabman"
    full_path = os.path.join(desired_dir, x + '.json')
    with open(full_path, 'w') as f:
        json_string = json.dumps(jsondata, indent=4)
        f.write(json_string)
    print(x + " retrieved successfully.")
else:
    print("Error at " + x)
    print(status)
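
For comparison, a hedged rework of the same snippet that only parses the body after the status check and reports the raw body when parsing fails; x, url_data and acces_token are the variables from the snippet above and are assumed to be defined by the surrounding loop:

import json
import os

import requests

data = requests.get(
    url_data,
    headers={"Host": "services.%%%.com", "Authorization": "Bearer " + acces_token},
)
status = data.status_code

if status == 200:
    try:
        jsondata = json.loads(data.content)
    except json.JSONDecodeError:
        # Body was not valid JSON, e.g. an HTML error page or truncated output.
        print("Error at " + x + ": response is not valid JSON")
        print(data.content[:200])
    else:
        desired_dir = "B:\\Cabman"
        full_path = os.path.join(desired_dir, x + ".json")
        with open(full_path, "w") as f:
            f.write(json.dumps(jsondata, indent=4))
        print(x + " retrieved successfully.")
else:
    print("Error at " + x)
    print(status)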

The raw data I get from the get-connector looks like the example below (personal data is masked with %%%):

b'[{"shiftNumber":null,"egateShiftID":%%%,"weeknumber":6,"day":"zaterdag","hostId":7231,"driverCode":%%%,"driverName":"%%%","vehicleId":2751,"vehicleNr":"2270","licensePlate":"%%%","ShiftStart":"2023-02-11T07:58:33","FirstMovementTimeStamp":"2023-02-11T07:59:06","FirstTripTimeStamp":null,"diffLogonAndFirstMovement":0,"shiftStartLocation":{"latitude":0,"longitude":0},"LastMovementBeforeLogoffTimeStamp":"2023-02-11T08:00:36","ShiftEnd":"2023-02-11T08:01:47","diffLogoffAndLastMovement":1,"shiftEndLocation":{"latitude":%%%,"longitude":%%%},

The final JSON files look like below (personal data is masked with %%%):

[
    {
        "shiftNumber": null,
        "egateShiftID": %%%,
        "weeknumber": 4,
        "day": "zondag",
        "hostId": 7231,
        "driverCode": %%%,
        "driverName": "%%%",
        "vehicleId": 2380,
        "vehicleNr": "1866",
        "licensePlate": "%%%",
        "ShiftStart": "2023-01-29T06:34:44",
        "FirstMovementTimeStamp": null,
        "FirstTripTimeStamp": null,
        "diffLogonAndFirstMovement": null,
        "shiftStartLocation": {
            "latitude": %%%,
            "longitude": %%%
        },
        "LastMovementBeforeLogoffTimeStamp": "2023-01-28T23:13:24",
        "ShiftEnd": "2023-01-29T06:59:14",
        "diffLogoffAndLastMovement": 465,
        "shiftEndLocation": {

It seems there’s a problem with the line ‘jsondata = json.loads(data.content)’, but this part works fine with smaller quantities of data. It does not seem to be a problem with the data source, as the get statement (data = requests.get) is not giving an error.
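
One thing worth noting about that reasoning: requests.get only raises for network-level failures. If the server replies with an HTTP error status (for example 500 or 502), the call still returns normally and the body is typically HTML or plain text rather than JSON. A small, self-contained illustration:

import requests

# httpbin's /status endpoint simply returns the requested status code.
resp = requests.get("https://httpbin.org/status/500")
print(resp.status_code)   # 500, yet requests.get raised no exception

# raise_for_status() converts 4xx/5xx responses into an HTTPError,
# making a server-side failure visible before any JSON parsing.
resp.raise_for_status()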

Asked By: Catscanner


Answers:

This problem is solved. The issue was not in the code itself but in the limits of the server on the other side. The amount of data is simply too big when pulling 30 days.
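
A possible workaround, sketched below on the assumption that the get-connector accepts a start and end date per request (the date values here are only examples), is to split the 30-day period into smaller windows that the server can handle and combine the results:

from datetime import date, timedelta

def daterange_chunks(start, end, days=15):
    """Yield (chunk_start, chunk_end) pairs covering start..end in smaller windows."""
    current = start
    while current < end:
        chunk_end = min(current + timedelta(days=days), end)
        yield current, chunk_end
        current = chunk_end

# A 30-day period split into two 15-day requests, which the connector
# is already known to handle; one requests.get call per window.
for chunk_start, chunk_end in daterange_chunks(date(2023, 1, 15), date(2023, 2, 14)):
    print(chunk_start, "->", chunk_end)
    # ...perform the requests.get call for this window here and append
    # the parsed JSON rows to a combined list...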

Answered By: Catscanner