Importing JSON file from a public S3 url #1

Open
ghost opened this issue Jan 18, 2018 · 0 comments
ghost commented Jan 18, 2018

This code works great, just what I needed. I pumped up the memory allocation to get very fast write speeds for 150 records with 6 attributes. Results below:

128 MB: COLD 22000 ms (22 seconds) when data is empty; 5000 ms warm, filled
256 MB: COLD 5000 ms (5 seconds) when data is empty; 2000 ms warm, filled
512 MB: COLD 3000 ms (3 seconds) when data is empty; 1500 ms warm, filled
1 GB: COLD 1500 ms (1.5 seconds) when data is empty; 600 ms warm, filled
1.5 GB: COLD 600 ms (0.6 seconds) when data is empty; 300 ms warm, filled
2 GB: COLD 700 ms (0.7 seconds) when data is empty; 300 ms warm, filled

Struggling to get an import from a public S3 URL or HTTP URL though. Do you know if an HTTP package is required, or perhaps S3 rights in the Lambda role?

Tried `var localData = "https://s3.eu-west-2.amazonaws.com/mybucketname/MOCK_DATA.json"` and a few other variants.

This repo shows using internal S3 data, but it would still be nice to get it working with an HTTP URL:
https://github.com/MarceloAlves/lambda-process-json/blob/master/index.js

UPDATE: I have `var request = require('request')` imported but just need to work out how to implement it.
