Sometimes developers need to restore large data from an Excel, CSV, or JSON file into a database, but they can't, because of memory limits, execution-time limits, 504 Gateway Time-out errors, and many other issues they face while inserting large data sets.
So in this post, I would like to show you how to insert large data into a database in minimal time, without overloading the server.
This example uses a Laravel insert script, but the same approach works in other PHP frameworks using that framework's insert method, and the technique helps with any backend technology.
cities.json | Data Source
This is my data source JSON file.
In this JSON file, I have more than fifteen thousand rows covering the cities of all Indian states.
I would like to insert this large data set into the database.
database/cities.json | 15,000+ rows
{
    "id": 10,
    "name": "Bombuflat",
    "district_id": 3,
    "state_id": 1,
    "created_at": "2022-01-30 12:50:35",
    "updated_at": "2022-03-21 12:19:21",
    "country_id": 101,
    "district_name": "South Andamans",
    "state_name": "Andaman and Nicobar Island",
    "country_name": "India",
    "sort_code": "IN",
    "country_code": "91",
    "status": 1,
    "cgst": "0",
    "sgst": "0"
},
{
    "id": 11,
    "name": "Ferrargunj",
    "district_id": 3,
    "state_id": 1,
    "created_at": "2022-01-30 12:50:35",
    "updated_at": "2022-03-21 12:19:21",
    "country_id": 101,
    "district_name": "South Andamans",
    "state_name": "Andaman and Nicobar Island",
    "country_name": "India",
    "sort_code": "IN",
    "country_code": "91",
    "status": 1,
    "cgst": "0",
    "sgst": "0"
},
First, I fetch all the data from the JSON file into the $cities variable. Since the file contains JSON-encoded data, I decode it using the json_decode() function.
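As a quick illustration of this step, json_decode() parses a JSON string into PHP objects. The sketch below uses a small inline string instead of my cities.json file:

```php
<?php
// json_decode() parses a JSON string into stdClass objects by default.
$json = '[{"id": 10, "name": "Bombuflat"}, {"id": 11, "name": "Ferrargunj"}]';
$cities = json_decode($json);

// Each element is an object, so fields are read with the -> operator.
echo $cities[0]->name; // Bombuflat
echo count($cities);   // 2
```

Passing true as the second argument (json_decode($json, true)) would return associative arrays instead of objects.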
Now, using a loop, I store the whole data set in a separate $cityData array variable, so all fifteen thousand rows are collected into one array.
Next, using the array_chunk() function, I split those fifteen thousand rows into chunks of one thousand rows each.
So array_chunk() returns fifteen arrays, and each array holds one thousand rows.
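To see how array_chunk() behaves, here is a minimal sketch with a small array, using chunks of 2 instead of 1000:

```php
<?php
$rows = [1, 2, 3, 4, 5];

// Split into chunks of at most 2 elements each.
$chunks = array_chunk($rows, 2);

// $chunks is [[1, 2], [3, 4], [5]]: three chunks,
// and the last chunk keeps the leftover element.
echo count($chunks); // 3
```

In the same way, 15,000 rows chunked by 1,000 give exactly fifteen arrays.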
$cities = json_decode(file_get_contents(public_path('database/cities.json')));

foreach ($cities as $key => $citie) {
    $cityData[] = [
        'id' => $citie->id,
        'name' => $citie->name,
        'district_id' => $citie->district_id,
        'state_id' => $citie->state_id,
        'country_id' => $citie->country_id,
        'country_name' => $citie->country_name,
        'state_name' => $citie->state_name,
        'district_name' => $citie->district_name,
        'sort_code' => $citie->sort_code,
        'country_code' => $citie->country_code,
        'status' => $citie->status,
        'cgst' => $citie->cgst,
        'sgst' => $citie->sgst,
        'created_at' => now()->format('Y-m-d H:i:s'),
        'updated_at' => now()->format('Y-m-d H:i:s'),
    ];
}

$datas = array_chunk($cityData, 1000);

foreach ($datas as $data) {
    DB::table('cities')->insert($data);
}
So after array_chunk(), the $datas variable holds fifteen arrays. We then loop over those fifteen arrays and insert each one into the database as a single query, so the whole import finishes very quickly.
You can insert more than 10 million records at a time into a database table using this chunk method.
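For very large imports, it can also help to wrap the chunked inserts in a database transaction, so the whole import either commits or rolls back as one unit. This is a sketch assuming the same $cityData array built above; DB::transaction() is the standard Laravel API for this:

```php
use Illuminate\Support\Facades\DB;

DB::transaction(function () use ($cityData) {
    // Insert 1,000 rows per query; if any chunk fails,
    // the transaction rolls back every chunk inserted so far.
    foreach (array_chunk($cityData, 1000) as $chunk) {
        DB::table('cities')->insert($chunk);
    }
});
```

Without the transaction, a failure halfway through would leave the table partially filled; with it, you can simply fix the data and rerun the import.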
Thanks for reading our post.
Dharmesh Chauhan
(Swapinfoway Founder) We are brothers from Gujarat, India, and fullstack developers working together since 2016. Our web development skills span PHP, Laravel, JavaScript, Vue.js, Ajax, APIs, payment gateway integration, databases, HTML5, CSS3, and server administration. If you need our services, please contact us.
Haresh Chauhan
(Co-Founder)