Tuesday 9 March 2021

Python Job on Linux EC2 Machine that drops file on Google Drive

I've seen various articles that address some of what I want, but not all of it. Using Python, how do I upload a file to a specific folder on Google Drive? Additionally, I want to make sure the token never expires, since the job will run unattended from a remote machine.

Edit: I have tried the following using a service account.

from google.oauth2 import service_account
from googleapiclient import discovery
from googleapiclient.discovery import build

# Full Drive scope so the job can create files
SCOPES = [
    'https://www.googleapis.com/auth/drive'
]
SERVICE_ACCOUNT_FILE = 'path/to/local/creds.json'

# Service-account credentials don't require interactive re-authorization,
# so the job can run unattended on the EC2 machine
credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)

service = discovery.build('drive', 'v3', credentials=credentials)
SS_SERVICE = build('sheets', 'v4', credentials=credentials)

file_metadata = {
    'name': 'A Test File.csv',
    'parents': [{"kind": "drive#folder",
                 "id": 'idoffolder'}],
    'mimeType': 'text/csv'
}

# Create the file, uploading the local CSV as its content
req = service.files().create(body=file_metadata,
                             media_body='path_to_local_csv')
# Append convert=true to the request URI to ask for conversion
req.uri = req.uri + '&convert=true'
resp = req.execute()
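For reference, here is a minimal sketch of how I understand the same upload is usually written against the v3 API, with parents as a plain list of folder ID strings and the content wrapped in a MediaFileUpload (the path and folder id are the same placeholders as above):

from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

credentials = service_account.Credentials.from_service_account_file(
    'path/to/local/creds.json',
    scopes=['https://www.googleapis.com/auth/drive'])
service = build('drive', 'v3', credentials=credentials)

file_metadata = {
    'name': 'A Test File.csv',
    # v3 expects parents as a plain list of folder ID strings
    'parents': ['idoffolder'],
}
media = MediaFileUpload('path_to_local_csv', mimetype='text/csv')

created = service.files().create(body=file_metadata,
                                 media_body=media,
                                 fields='id, parents').execute()
print(created)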

I retrieved the id of the folder from the URL: https://drive.google.com/drive/u/1/folders/idIwanttouse

This code seems to work in that there are no errors; however, I don't see a file called A Test File.csv in the folder I specified.
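To double-check whether the file actually landed in that folder, I believe the folder's children can be listed with a query like this (using the same placeholder folder id):

children = service.files().list(
    q="'idoffolder' in parents and trashed = false",
    fields="files(id, name, parents)").execute()
for f in children.get('files', []):
    print(f['name'], f['id'])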

Interestingly, when I run the following:

service.files().list().execute()

I see the file "A Test File.csv" but none of the other files in my Google Drive shared account.
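If the shared folder lives on a shared drive, my understanding is that the list call needs the shared-drive flags before it will show anything there; roughly:

shared = service.files().list(
    q="'idoffolder' in parents",
    includeItemsFromAllDrives=True,
    supportsAllDrives=True,
    fields="files(id, name, parents)").execute()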

A couple of notes: I'm trying to add files to a shared business account that I am a member of, and I'm using a service account.
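If it matters, I assume the create call would also need supportsAllDrives=True when the target folder is on a shared drive; a sketch of that variant (same placeholders as above) is:

from googleapiclient.http import MediaFileUpload

media = MediaFileUpload('path_to_local_csv', mimetype='text/csv')
resp = service.files().create(
    body={'name': 'A Test File.csv', 'parents': ['idoffolder']},
    media_body=media,
    supportsAllDrives=True,
    fields='id').execute()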

Any ideas?



from Python Job on Linux EC2 Machine that drops file on Google Drive
