
Importing JSON file from a public S3 url #1

Open
@ghost

Description

This code works great - just what I needed. I pumped up the memory allocation and got very fast write speeds for 150 records with 6 attributes each. Results below:

| Memory | Cold start (data empty) | Warm (filled) |
| ------ | ----------------------- | ------------- |
| 128 MB | 22000 ms | 5000 ms |
| 256 MB | 5000 ms | 2000 ms |
| 512 MB | 3000 ms | 1500 ms |
| 1 GB | 1500 ms | 600 ms |
| 1.5 GB | 600 ms | 300 ms |
| 2 GB | 2200 ms | 300 ms |

Struggling to get an import from a public S3 URL or HTTP URL, though. Do you know if an HTTP package is required, or perhaps S3 rights in the Lambda role?

Tried `var localData = "https://s3.eu-west-2.amazonaws.com/mybucketname/MOCK_DATA.json"` and a few other variants.

This repo shows reading internal S3 data, but it would still be nice to get it working with an HTTP URL:
https://github.com/MarceloAlves/lambda-process-json/blob/master/index.js
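
The in-account route that repo takes is roughly the sketch below (aws-sdk v2, which is preinstalled in the Lambda Node.js runtime). Bucket and key names are placeholders; this path is the one that needs `s3:GetObject` in the Lambda execution role, whereas a public HTTPS URL does not:

```javascript
// Pure helper: decode an S3 object body (a Buffer) into JSON.
function bodyToJson(body) {
  return JSON.parse(body.toString('utf8'));
}

// Fetch an object from S3 and parse it as JSON.
function getJsonFromS3(bucket, key) {
  // require() inside the function so this file also loads where
  // aws-sdk is not installed; in the Lambda runtime it is available.
  const AWS = require('aws-sdk');
  const s3 = new AWS.S3();
  return s3
    .getObject({ Bucket: bucket, Key: key })
    .promise()
    .then((res) => bodyToJson(res.Body));
}
```

Usage would be along the lines of `const records = await getJsonFromS3('mybucketname', 'MOCK_DATA.json');`.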

UPDATE: I have `var request = require('request')` imported but just need to work out how to implement it.
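
With `request`, the body arrives in the callback rather than as a return value, and `json: true` makes the package parse it for you. An untested sketch of the wiring (the lazy require is just so the snippet loads standalone; in the Lambda it can stay at the top of the file):

```javascript
// Fetch and parse JSON from a URL using the request package.
// request is callback-based: (err, response, body).
function fetchWithRequest(url, callback) {
  const request = require('request'); // lazy so this file loads without it
  request({ url, json: true }, (err, res, body) => {
    if (err) return callback(err);
    if (res.statusCode !== 200) {
      return callback(new Error(`HTTP ${res.statusCode} for ${url}`));
    }
    callback(null, body); // already parsed because of json: true
  });
}
```

Then `fetchWithRequest('https://s3.eu-west-2.amazonaws.com/mybucketname/MOCK_DATA.json', (err, data) => { ... })` hands the parsed records to the callback.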
