AWS Kinesis Stream: In-Detail Review
I am new to AWS. I have implemented some functionality in AWS using Java. My requirement is to insert a 50 MB CSV file into an RDS PostgreSQL instance in one go.
I first tried the AWS Lambda service, but Lambda stops after 5 minutes (a limitation of Lambda functions), so I dropped that approach.
As a second approach, I wrote a Java Lambda function triggered by an S3 event, which reads the CSV file dropped on S3 and writes it to a Kinesis stream using the putRecord call. As I understand it, Kinesis can read the CSV file record by record. The Kinesis stream then invokes a second Lambda function that saves the data to PostgreSQL.
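For context, this is roughly the producer pattern I am using, as a minimal stdlib-only sketch: the Kinesis client is replaced by a hypothetical `RecordSink` interface, since the real AWS SDK `putRecord` call can be throttled (`ProvisionedThroughputExceededException`), and rows that are not retried after a throttle are silently lost.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class CsvToStreamSketch {

    // Stand-in for the AWS SDK Kinesis client's putRecord call (hypothetical
    // interface for this sketch). The real call can fail when a shard is
    // throttled; here a false return models that failure.
    interface RecordSink {
        boolean put(String partitionKey, String data);
    }

    // Push every CSV row to the stream, retrying throttled rows instead of
    // dropping them. Without a retry loop, throttled putRecord calls are a
    // classic reason some rows never reach the stream.
    static int sendAllRows(List<String> csvRows, RecordSink sink) {
        int delivered = 0;
        for (int i = 0; i < csvRows.size(); i++) {
            String row = csvRows.get(i);
            int attempts = 0;
            while (!sink.put(Integer.toString(i), row)) {
                if (++attempts > 10) {
                    throw new IllegalStateException("row " + i + " failed after retries");
                }
                // in a real producer: back off (e.g. Thread.sleep) before retrying
            }
            delivered++;
        }
        return delivered;
    }

    public static void main(String[] args) {
        List<String> rows = new ArrayList<>();
        for (int i = 0; i < 1000; i++) rows.add("col1,col2," + i);

        // Mock sink that randomly "throttles" about 20% of calls.
        Random rng = new Random(42);
        RecordSink flaky = (key, data) -> rng.nextInt(5) != 0;

        // With retries, every row is eventually delivered.
        System.out.println(sendAllRows(rows, flaky));
    }
}
```

The interface name, retry limit, and mock failure rate are assumptions for illustration; the point is only that each `putRecord` must be retried on failure.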
Everything was fine, but my confusion is that only 32,000 records are being inserted, while my CSV has 50,000 records. Since the Kinesis stream reads each row as a record, it will invoke the Lambda separately each time, right? So why isn't it saving everything?
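One detail worth checking here: as I understand it, a Kinesis event source mapping does not invoke the Lambda once per record; it delivers batches of up to the configured batch size per invocation, so the handler receives a list of records it must process in full. A small sketch of the invocation arithmetic (the batch sizes are example values, not my actual configuration):

```java
public class BatchMath {

    // Number of Lambda invocations needed to consume n records when the
    // event source mapping delivers up to batchSize records per invocation
    // (ceiling division).
    static long invocations(long records, long batchSize) {
        return (records + batchSize - 1) / batchSize;
    }

    public static void main(String[] args) {
        // 50,000 CSV rows with a batch size of 100: 500 invocations,
        // each receiving a list of records, not a single row.
        System.out.println(invocations(50_000, 100));
        // Batch size 1: one record per invocation, 50,000 invocations.
        System.out.println(invocations(50_000, 1));
    }
}
```

If the handler only inserts the first record of each batch instead of looping over the whole list, most rows would be dropped, which could explain an undercount like this.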
One more question: my Kinesis stream is configured as below.
Also, in my Lambda I configured the Kinesis trigger as follows.
Is this the correct configuration for my requirement? If I set the batch size to 1, will my function insert the complete set of records? Please share what you know about this. It would be a great help. Thanks in advance!
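Whatever batch size is chosen, the consumer Lambda has to insert every record in the batch it receives. One pattern I am considering is folding the whole batch into a single multi-row INSERT rather than one statement per row. A sketch that only builds the parameterized SQL (the table and column names are placeholders; in the real function the statement would be executed through JDBC with the values bound to the `?` placeholders):

```java
import java.util.List;

public class BatchInsertSketch {

    // Build a parameterized multi-row INSERT for use with a JDBC
    // PreparedStatement, e.g.
    //   INSERT INTO my_table (col_a, col_b) VALUES (?, ?), (?, ?), ...
    // Binding values via placeholders avoids SQL injection; never
    // concatenate CSV values directly into the statement.
    static String multiRowInsert(String table, List<String> columns, int rowCount) {
        String cols = String.join(", ", columns);
        String placeholders = "(" + "?, ".repeat(columns.size() - 1) + "?)";
        StringBuilder sql =
            new StringBuilder("INSERT INTO " + table + " (" + cols + ") VALUES ");
        for (int i = 0; i < rowCount; i++) {
            if (i > 0) sql.append(", ");
            sql.append(placeholders);
        }
        return sql.toString();
    }

    public static void main(String[] args) {
        System.out.println(multiRowInsert("my_table", List.of("col_a", "col_b"), 3));
    }
}
```

`my_table`, `col_a`, and `col_b` are made-up names for illustration; one statement per batch also keeps the insert easy to wrap in a single transaction.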
I have posted the same question on Stack Overflow.