Handling Excel Files in Laravel (Part 2)
Time to read: 8 minutes
The first part of this tutorial series taught you how to import data from an Excel file into a Laravel application, as well as how to export data to an Excel file.
This, the second and final part of the series, will teach you how to validate data before importing it from an Excel file, handle file uploads securely, handle errors, and use some of the other features the Excel library has to offer.
Prerequisites
PHP 8.2 or newer
Access to a MySQL database; use Laravel Sail to save time and effort setting one up
Composer globally installed
An API client, such as curl, Postman, Insomnia, or Hoppscotch
Your preferred text editor or IDE
Prior knowledge of Laravel and PHP would be helpful, but is not mandatory
Excel file validation
In the first section of this series, the data from the Excel file was imported as-is. Because of that, it was unclear what data type should be expected for the columns in each row. Ideally, you should check each row before inserting it into the database.
The Excel library we're using lets you do this by taking advantage of Laravel's built-in validation mechanisms. This is achieved through the WithValidation concern, which allows you to define validation rules for each row of data loaded from the Excel file. The concern requires a rules() method, which returns an array of validation rules, similar to how you would define validation rules in Laravel's form requests or controllers.
To test this, we'll modify the app/Imports/FinancialDataImport.php file created in the previous tutorial and add the rules() method to it.
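Here is a minimal sketch of what that class might look like with the rule added. The FinancialData model, the WithHeadingRow concern, and the column names mapped in model() are assumptions carried over from Part 1; adjust them to match your own import class:

```php
<?php

namespace App\Imports;

use App\Models\FinancialData;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithHeadingRow;
use Maatwebsite\Excel\Concerns\WithValidation;

class FinancialDataImport implements ToModel, WithHeadingRow, WithValidation
{
    // Validation rules applied to every row before it is inserted
    public function rules(): array
    {
        return [
            'segment' => 'required|integer',
        ];
    }

    public function model(array $row)
    {
        return new FinancialData([
            'segment' => $row['segment'],
            // ... the other columns mapped in Part 1
        ]);
    }
}
```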
Since the Excel file has many columns, we will only write a validation rule for one of them. The code above now requires that the "segment" column in every row be non-empty and contain an integer, which is not the case for the data in the Excel file.
If you go over the file, you'll see that the values in that column are all strings. Next, we wrap a try-catch block around the import() method to handle any errors that may arise.
Now, update the code in app/Http/Controllers/ExcelController.php to match the code below:
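A sketch of the updated controller is below. The file name passed to Excel::import() is an assumption based on Part 1; the exception class and its failures() method come from the maatwebsite/excel package:

```php
<?php

namespace App\Http\Controllers;

use App\Imports\FinancialDataImport;
use Maatwebsite\Excel\Facades\Excel;
use Maatwebsite\Excel\Validators\ValidationException;

class ExcelController extends Controller
{
    public function import()
    {
        try {
            // Assumed file name from Part 1
            Excel::import(new FinancialDataImport, 'financial_data.xlsx');
        } catch (ValidationException $e) {
            // Collect every row-level validation failure and return it
            $errors = [];
            foreach ($e->failures() as $failure) {
                $errors[] = [
                    'row'    => $failure->row(),
                    'errors' => $failure->errors(),
                ];
            }
            return response()->json($errors, 422);
        }

        return response()->json(['done']);
    }
}
```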
Then, start the Laravel app and run a fresh migration using the commands below to remove the data added to the database in the previous part.
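Something like the following, run from the project root (use the Sail equivalents if you set the project up with Laravel Sail):

```shell
# Drop all tables and re-run every migration
php artisan migrate:fresh

# Start the development server (in a separate terminal)
php artisan serve
```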
Finally, visit the import route. You should see an error indicating that the segment field must be an integer. If you check the database, you'll see that no rows were imported.
Now, go back to app/Imports/FinancialDataImport.php, change integer in the rules() method to string, and revisit the import route. This time, it will run smoothly, returning a JSON array containing the word "done".
Batch inserts and upserts
When importing large data sets from Excel files, batch inserts are generally more efficient than single inserts. Batch inserts can boost speed by lowering the number of database queries required.
In the sample file we've been working with, the total number of rows is around 700, which is sizeable enough to put a noticeable load on the server if it processes many requests like this in a real-world scenario.
To improve performance, we'll split the data into chunks of 50 records or fewer and insert each chunk with a single query. As expected, the library offers a seamless and clean way to do this.
Update app/Imports/FinancialDataImport.php to reflect the code below:
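A minimal sketch using the WithBatchInserts concern, whose batchSize() method sets how many model rows are combined into one insert query. The model class and columns are assumptions from Part 1:

```php
<?php

namespace App\Imports;

use App\Models\FinancialData;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class FinancialDataImport implements ToModel, WithHeadingRow, WithBatchInserts
{
    // Insert rows in batches of 50 instead of one query per row
    public function batchSize(): int
    {
        return 50;
    }

    public function model(array $row)
    {
        return new FinancialData([
            'segment' => $row['segment'],
            // ... the other columns mapped in Part 1
        ]);
    }
}
```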
Now, if you revisit the "/import" route, you should see that the request completes more quickly. You can measure this using any API client you're familiar with, such as Postman, Insomnia, or Hoppscotch.
The term upsert is a mix of the terms update and insert. In the context of relational databases, an upsert is a database operation that updates an existing row if a provided value already exists in a table, and inserts a new record if the supplied value does not exist.
Currently, if you examine the database table created in the first part of this series, you will see duplicate row entries in the database, which should not be the case in a "real-world" scenario.
To address this issue, the library includes an interface called WithUpserts, which, like the WithValidation interface, requires you to define a method: uniqueBy(). A unique key is needed to determine whether an imported row already exists. Our financial sample data has no unique column, so I'll use some example code to explain how this works.
Suppose we run an email marketing service and have subscriber information recorded in an Excel file. We want to import this data into our application, but there is a chance of duplicate data in the file.
One way duplicate data may affect the application is when emails are sent twice. To avoid this, we'd create a new file in the app/Imports directory named UsersImport.php and paste the following code into it:
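A sketch of such a class is below. The WithUpserts concern and its uniqueBy() method come from the library; the name, email, and password columns are assumptions about the subscriber spreadsheet, and the placeholder password is only there because the default users migration requires one:

```php
<?php

namespace App\Imports;

use App\Models\User;
use Illuminate\Support\Facades\Hash;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithHeadingRow;
use Maatwebsite\Excel\Concerns\WithUpserts;

class UsersImport implements ToModel, WithHeadingRow, WithUpserts
{
    // Rows whose email already exists will update the existing record
    public function uniqueBy()
    {
        return 'email';
    }

    public function model(array $row)
    {
        return new User([
            'name'     => $row['name'],
            'email'    => $row['email'],
            // Placeholder: real apps should handle credentials properly
            'password' => Hash::make('password'),
        ]);
    }
}
```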
In the above example, if a user already exists with the same email address, that row will be updated rather than a fresh row being created. In the background, this functionality uses Laravel's upsert() method, with the value returned by uniqueBy() passed as the second argument. That argument names the column(s) that uniquely identify records in the table. It is worth noting that the provided columns must have a primary or unique index.
Chunk reading
I know what you're thinking: "Chunk reading? Shouldn't that have come earlier?" Although it arguably belongs before batch inserts, discussing it now ensures you already understand the related concepts.
Importing large Excel files can cause performance difficulties and significantly increase memory use, because the library normally attempts to read the complete file into memory before storing it in the database. Chunk reading avoids this by reading the file in smaller segments rather than all at once.
The WithChunkReading interface can help keep this memory growth under control by reading the spreadsheet in chunks. In the app/Imports/FinancialDataImport.php file created in the first part, implement the WithChunkReading interface and define the chunkSize() method, as seen below:
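A minimal sketch, again assuming the FinancialData model and columns from Part 1:

```php
<?php

namespace App\Imports;

use App\Models\FinancialData;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class FinancialDataImport implements ToModel, WithHeadingRow, WithChunkReading
{
    // Read the spreadsheet 100 rows at a time
    public function chunkSize(): int
    {
        return 100;
    }

    public function model(array $row)
    {
        return new FinancialData([
            'segment' => $row['segment'],
            // ... the other columns mapped in Part 1
        ]);
    }
}
```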
In this example, we implemented the WithChunkReading interface, which requires the chunkSize() method. This method lets you define how many records to process in each batch. When you use this class, the library will read and process the Excel file in smaller chunks, which reduces memory use and improves overall import efficiency.
Combining batch inserts and chunk reads, however, produces the best results in terms of time and memory consumption. So, update app/Imports/FinancialDataImport.php to reflect the changes in the code below.
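A sketch combining both concerns on the same class (model and columns assumed from Part 1):

```php
<?php

namespace App\Imports;

use App\Models\FinancialData;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class FinancialDataImport implements ToModel, WithHeadingRow, WithBatchInserts, WithChunkReading
{
    // Read 100 rows from the file at a time...
    public function chunkSize(): int
    {
        return 100;
    }

    // ...and write them to the database 50 rows per insert query
    public function batchSize(): int
    {
        return 50;
    }

    public function model(array $row)
    {
        return new FinancialData([
            'segment' => $row['segment'],
            // ... the other columns mapped in Part 1
        ]);
    }
}
```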
Export from a view
The maatwebsite/excel package in Laravel provides functionality that allows you to export data from a custom view. The FromView interface enables you to generate highly customized Excel exports that go beyond the standard data export capability.
In the first part, the data from the users table was exported directly from the UsersExport class. In cases where the table has complex relations and building a query for it becomes tricky, the FromView interface comes in handy. Just as you would return a view from a controller in a typical Laravel application, you handle exports to a view in the same manner.
Update the code in the app/Exports/UsersExport.php file to match the code below:
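A minimal sketch of the FromView export; the view name exports.users matches the Blade file created next, and User::all() is an assumption based on the export from Part 1:

```php
<?php

namespace App\Exports;

use App\Models\User;
use Illuminate\Contracts\View\View;
use Maatwebsite\Excel\Concerns\FromView;

class UsersExport implements FromView
{
    // The rendered view's <table> becomes the spreadsheet contents
    public function view(): View
    {
        return view('exports.users', [
            'users' => User::all(),
        ]);
    }
}
```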
Next, create a users.blade.php file in resources/views/exports and add the following content to the file:
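Something like the following, assuming the users table has name and email columns:

```blade
<table>
    <thead>
    <tr>
        <th>Name</th>
        <th>Email</th>
    </tr>
    </thead>
    <tbody>
    @foreach($users as $user)
        <tr>
            <td>{{ $user->name }}</td>
            <td>{{ $user->email }}</td>
        </tr>
    @endforeach
    </tbody>
</table>
```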
One good thing about this method of export is that you can add table headers conventionally, as you would in an HTML table, rather than using a dedicated concern such as WithHeadings.
When the exportUsers() method is called by visiting the "/export" route, the library will render the exports.users view, convert it to an Excel file, and prompt the user to download the file.
Advanced formatting and styling
In the previous section we saw how Blade Templates can be converted into an Excel spreadsheet. You can further customize them to include additional formatting and styling. You can apply various styles to individual cells, such as font styles, alignment, borders, and background colors.
To see this work, update resources/views/exports/users.blade.php to match the following:
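A sketch of a styled version is below. The underlying HTML reader translates a limited subset of inline CSS (such as background-color, color, font-weight, and borders) into cell styles; the exact colors here are arbitrary:

```blade
<table>
    <thead>
    <tr>
        {{-- Header row: bold white text on a blue background --}}
        <th style="background-color: #4472c4; color: #ffffff; font-weight: bold; text-align: center;">Name</th>
        <th style="background-color: #4472c4; color: #ffffff; font-weight: bold; text-align: center;">Email</th>
    </tr>
    </thead>
    <tbody>
    @foreach($users as $user)
        <tr>
            <td style="border: 1px solid #000000;">{{ $user->name }}</td>
            <td style="border: 1px solid #000000;">{{ $user->email }}</td>
        </tr>
    @endforeach
    </tbody>
</table>
```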
Autosize while exporting
If you inspect the recently downloaded Excel file, you will see how misaligned the column widths are, and how difficult it is to read the file's contents without manually adjusting each column's width. Implementing the ShouldAutoSize interface in the UsersExport class solves this issue.
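ShouldAutoSize is a marker interface, so no extra method is required; simply add it to the implements list:

```php
<?php

namespace App\Exports;

use App\Models\User;
use Illuminate\Contracts\View\View;
use Maatwebsite\Excel\Concerns\FromView;
use Maatwebsite\Excel\Concerns\ShouldAutoSize;

// ShouldAutoSize needs no method; the library sizes columns automatically
class UsersExport implements FromView, ShouldAutoSize
{
    public function view(): View
    {
        return view('exports.users', [
            'users' => User::all(),
        ]);
    }
}
```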
Now, when the file is downloaded, each column will be auto-sized to fit its contents, ensuring that all values are visible.
File Upload and Security
When expecting file uploads from users, it is essential not to trust the input. Users should be considered guilty or malicious until proven otherwise. So, always validate the files the user uploads to your server to avoid unexpected application errors or a possible crash.
For example, a user could send a .txt file instead of a .xlsx file, which would break the application. So, to avoid this, there are a few things to remember.
Generate random file names
One common security technique is to generate a random file name rather than use the user-provided file name. This helps to prevent possible vulnerabilities such as file naming clashes or path traversal attacks, in which an attacker attempts to access restricted directories or files on the server.
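A sketch of this technique; the upload() method name, the file input name, and the public/uploads destination are illustrative assumptions, while Str::random() and storeAs() are standard Laravel APIs:

```php
<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Illuminate\Support\Str;

class UploadController extends Controller
{
    public function upload(Request $request)
    {
        $file = $request->file('file');

        // 30 random characters, keeping the original extension
        $filename = Str::random(30) . '.' . $file->getClientOriginalExtension();

        // Store the file under public/uploads with the generated name
        $file->storeAs('public/uploads', $filename);

        return response()->json(['stored_as' => $filename]);
    }
}
```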
In this example, we're generating a random 30-character file name and using the storeAs() method to save the file in the public/uploads directory. This approach ensures that the uploaded file has a unique name, making it much harder for an attacker to predict or guess the file's location.
Always validate user input
As previously emphasized, do not trust user input. In the code sample below for uploading user data, we use Laravel's built-in validate() method to ensure that:
A file is uploaded
The file has one of the allowed MIME types (xlsx, xls, or csv)
The file does not exceed the maximum size of 2048 KB
This keeps users from uploading unsupported or overly large files, which might create problems or consume server resources.
To do that, create an import_user() method in app/Http/Controllers/ExcelController.php and paste in the code below.
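A sketch of that method; the success message and redirect-back behavior are assumptions chosen to pair with the Blade form created later in this section:

```php
use App\Imports\UsersImport;
use Illuminate\Http\Request;
use Maatwebsite\Excel\Facades\Excel;

public function import_user(Request $request)
{
    // Reject anything that isn't an xlsx/xls/csv file under 2048 KB
    $request->validate([
        'file' => 'required|mimes:xlsx,xls,csv|max:2048',
    ]);

    Excel::import(new UsersImport, $request->file('file'));

    return back()->with('success', 'Users imported successfully.');
}
```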
Following that, update the model() function in app/Imports/UsersImport.php with the code below.
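A sketch of the updated method; the name and email columns are assumed to match the uploaded file's header row, and the hashed placeholder password is only there to satisfy the default users migration:

```php
use App\Models\User;
use Illuminate\Support\Facades\Hash;

public function model(array $row)
{
    return new User([
        'name'     => $row['name'],
        'email'    => $row['email'],
        // Placeholder credential; adapt to your own user flow
        'password' => Hash::make('password'),
    ]);
}
```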
In the routes/web.php file, add the code below:
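Something like the following, pairing a GET route that shows the upload form with a POST route that handles the file:

```php
use App\Http\Controllers\ExcelController;
use Illuminate\Support\Facades\Route;

// Show the upload form
Route::get('/import/user', function () {
    return view('import');
});

// Handle the uploaded Excel file
Route::post('/import/user', [ExcelController::class, 'import_user']);
```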
Finally, create a file named import.blade.php in the resources/views directory and paste this content into it.
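A minimal sketch of the form, posting to the /import/user route defined above and displaying the flashed success message:

```blade
<!DOCTYPE html>
<html lang="en">
<head>
    <title>Import Users</title>
</head>
<body>
    @if (session('success'))
        <p>{{ session('success') }}</p>
    @endif

    <form action="/import/user" method="POST" enctype="multipart/form-data">
        @csrf
        <input type="file" name="file" required>
        <button type="submit">Import</button>
    </form>
</body>
</html>
```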
With everything set up, you can now upload an Excel file via the "/import/user" route (http://localhost:8000/import/user) with a header row that matches the one expected by the UsersImport class.
If the users in the Excel file were successfully imported, you'll see a confirmation message displayed.
Don’t make uploaded files directly accessible
If uploaded files cannot be directly accessed in the browser, they cannot be requested and executed. It is best to save uploaded files in a safe location, such as the storage/app/uploads directory, which is not directly accessible from the web. This helps prevent unwanted access to uploaded data.
You can configure the storage settings in the config/filesystems.php file:
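A sketch of such a disk; the uploads disk name is an illustrative assumption. Because the disk uses the local driver, points at storage/app/uploads, and defines no url key, its files are never served directly by the web server:

```php
// config/filesystems.php

'disks' => [

    // Private local disk: files here are not reachable from the browser
    'uploads' => [
        'driver' => 'local',
        'root'   => storage_path('app/uploads'),
    ],

],
```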
Conclusion
In this series, we've covered everything from project setup and library installation to reading or importing, exporting or writing, validating, and safeguarding Excel files. By combining Laravel and the maatwebsite/excel library, you can effectively add Excel file manipulation capabilities to your web applications, speeding up data processing and much more. Happy building!
Prosper is a freelance Laravel web developer and technical writer who enjoys working on innovative projects that use open-source software. When he's not coding, he searches for the ideal startup opportunities to pursue. You can find him on Twitter and LinkedIn.