question

BK-2724 asked

Amazon S3 access points: Data Factory transfer

I'm trying to transfer data over from Amazon S3 to my Azure Data Lake. I'm currently given access to the S3 bucket via an access point.

https://aws.amazon.com/s3/features/access-points/

When I access it with Python, it looks like this:

 import boto3

 AWSAccessKeyId = 'AKIAJQSDEGOZWTVPUETQ'
 AWSSecretKey = 'xxxxx'

 Bucket = "arn:aws:s3:us-west-2:xxxxx:accesspoint/datafeed"
 Prefix = "dplp-aea10a31/"

 # The access point ARN is in us-west-2, so the client region should match it.
 client = boto3.client(
     's3',
     aws_access_key_id=AWSAccessKeyId,
     aws_secret_access_key=AWSSecretKey,
     region_name='us-west-2',
 )

 dict_objects = client.list_objects_v2(Bucket=Bucket, Prefix=Prefix)


The problem I have now is that I have to download each file and read it locally. I'd like to transfer it to my data lake to make it easier to work with.

I don't see any way to do this through Azure Synapse / Data Factory at first look.
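For reference, here is a rough sketch of the manual copy I'm trying to avoid. This is not tested end-to-end: the ADLS account URL, credential, and file-system name are placeholders, and it assumes the `azure-storage-file-datalake` package.

```python
def adls_path_for(s3_key: str, prefix: str = "dplp-aea10a31/") -> str:
    """Map an S3 object key to a target path in the lake, dropping the feed prefix."""
    return s3_key[len(prefix):] if s3_key.startswith(prefix) else s3_key


def copy_feed_to_lake():
    # SDK imports are kept local so the helper above works without boto3 / the Azure SDK.
    import boto3
    from azure.storage.filedatalake import DataLakeServiceClient

    s3 = boto3.client(
        's3',
        aws_access_key_id='xxxxx',
        aws_secret_access_key='xxxxx',
        region_name='us-west-2',
    )
    lake = DataLakeServiceClient(
        account_url="https://<storage-account>.dfs.core.windows.net",  # placeholder
        credential="<storage-account-key>",  # placeholder
    )
    fs = lake.get_file_system_client("raw")  # placeholder container name

    bucket = "arn:aws:s3:us-west-2:xxxxx:accesspoint/datafeed"
    prefix = "dplp-aea10a31/"
    # Note: list_objects_v2 returns at most 1000 keys per call; a real job would paginate.
    for obj in s3.list_objects_v2(Bucket=bucket, Prefix=prefix).get("Contents", []):
        body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read()
        fs.get_file_client(adls_path_for(obj["Key"])).upload_data(body, overwrite=True)
```

It works, but it pulls every file through my machine, which is exactly what I'm hoping a pipeline would avoid.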

azure-data-factory

1 Answer

PRADEEPCHEEKATLA-MSFT answered

Hello @BK-2724,

Thanks for the question and using MS Q&A platform.

There are a couple of ways to access Amazon Simple Storage Service (S3) data from Azure.
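One wrinkle with access points: Data Factory's Amazon S3 connector expects a bucket name, not an ARN. Every S3 access point also has a bucket-style alias, which may be accepted wherever a bucket name is required. As a sketch (hedged: the alias lookup uses the `s3control` `get_access_point` API and needs AWS credentials; whether the connector accepts the alias is something you would need to verify):

```python
def parse_access_point_arn(arn: str):
    """Split arn:aws:s3:<region>:<account-id>:accesspoint/<name> into its parts."""
    head, resource = arn.rsplit(":", 1)
    _, _, _, region, account_id = head.split(":")
    return region, account_id, resource.split("/", 1)[1]


def access_point_alias(arn: str) -> str:
    """Look up the bucket-style alias for an S3 access point (requires AWS credentials)."""
    import boto3  # imported here so the ARN parser works without boto3 installed
    region, account_id, name = parse_access_point_arn(arn)
    resp = boto3.client("s3control", region_name=region).get_access_point(
        AccountId=account_id, Name=name)
    return resp["Alias"]  # the alias looks like "<name>-<suffix>-s3alias"
```

The returned alias can then be tried in the bucket field of the Data Factory dataset in place of the ARN.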

Hope this will help. Please let us know if you have any further queries.


  • Please don't forget to click on "Accept Answer" or up-vote whenever the information provided helps you. Original posters help the community find answers faster by identifying the correct answer. Here is how

  • Want a reminder to come back and check responses? Here is how to subscribe to a notification

  • If you are interested in joining the VM program and help shape the future of Q&A: Here is how you can be part of Q&A Volunteer Moderators


Thanks - I tried this but it didn't work, which is why I posted the question. I'll be more specific about the problem I'm having. It works with the Python script above, but I can't figure out how to make it work through Synapse / Data Factory.

I think it has something to do with the access point procedure.

Test connection works when I put in the Access Key and the Secret Key.

It does not work when I enter the bucket. Again, this isn't a traditional bucket; it's reached through an access point, which I don't have experience with.

Bucket="arn:aws:s3:us-west-2:xxxxx:accesspoint/datafeed",
Prefix="dplp-aea10a31/"

I may just be entering it wrong, but I've tried numerous permutations and they all error.


(attached screenshot: 134076-image.png, 7.8 KiB)

sorry, I posted my response as an answer instead, please see below


Hello @BK-2724,

In order to troubleshoot this issue, could you please share more details:

  • When you say "It does not work when I put the bucket.", what happens when you browse and select the bucket? Do you see any error message?

  • When you say "this isn't a traditional bucket as it has an access point which I don't have experience with", what do you mean by a traditional bucket, as opposed to one with an access point?

