Database dumps can pose serious challenges, as the process involves intricate details. When carried out in an optimized way, a data dump can transfer data from a shared server without any performance decline. To avoid server lag during a dump, it is important to plan productive strategies and execute them properly. vteam #446 worked on several multidimensional projects, one of which was an information system developed for the members of the Reproductive Toxicology Center. The requirement was to dump the complete database/custom records in MS Access format and upload them to the members' hosting servers on specified dates.
Our client was using a shared hosting server that kept stalling mid-way, killing the process while dumping data and transferring files. Because of the sheer size of the data dump, server performance degraded badly. vteam #446 therefore faced the challenge of dumping and transferring the data without any performance loss on the server.
An efficient solution was required to complete the dump and upload smoothly. After some research, vteam #446 resolved the issue by implementing a queue worker backed by a Beanstalkd queue on Laravel Forge, which proved to be the best fit for the job.
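The article does not include the actual configuration, but a minimal sketch of how a Beanstalkd-backed queue worker is typically wired up on Laravel Forge might look like the following. The connection name, memory limit, and timeout values here are illustrative assumptions, not the team's real settings:

```shell
# Install Beanstalkd on the server (Forge can also provision it).
sudo apt-get install -y beanstalkd

# Point Laravel at Beanstalkd via the environment file:
#   QUEUE_CONNECTION=beanstalkd   (older Laravel releases use QUEUE_DRIVER)

# Run a long-lived worker, typically registered as a Forge daemon so it is
# restarted automatically. --memory sets the threshold (in MB) at which the
# worker gracefully restarts; --timeout allows long-running dump jobs.
php artisan queue:work beanstalkd --memory=1024 --timeout=3600 --tries=1
```

Dispatching the dump as a queued job means the web request returns immediately while the worker processes the heavy export in the background, which is what sidesteps the shared-hosting request limits described above.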
The queue worker was allowed to consume the full server memory instead of being capped at the shared-hosting allocation. It manages large background processes efficiently and provides an interface for inspecting running and queued jobs. unixODBC with the Easysoft ODBC driver was used to insert, update, and delete data in MS Access (.accdb) files.
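The original DSN setup is not shown; a rough sketch of what a unixODBC data source for an .accdb file and a write through the `isql` command-line tool could look like follows. The DSN name, driver label, file path, and table columns are all hypothetical placeholders:

```shell
# /etc/odbc.ini -- define a DSN for the target Access file (illustrative):
#   [rtc_members]
#   Driver   = Easysoft ODBC-ACCESS Driver
#   Database = /var/data/members.accdb

# Pipe an SQL statement to unixODBC's isql tool in batch mode (-b)
# using that DSN; the same channel handles UPDATE and DELETE.
echo "INSERT INTO members (name, email) VALUES ('Jane', 'jane@example.com');" \
  | isql -b rtc_members
```

In the Laravel setup described above, equivalent statements would be issued from inside the queued job rather than by hand, so each scheduled dump writes its records into the .accdb file as part of the background process.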
With this solution in place, the memory limit was no longer an issue for large processes.