Experiences of a FileMaker Pro Database Developer

Unresolved (

Comic relief at xkcd.com

Deep in a calculation, we’ve all been there: hunting for a missing left or right parenthesis, among other things. After racking our brains, we sometimes just need a little relief and some humor to break up the day. Check out more comic relief at xkcd.com.


Working Around Table View Limits

After digging a bit, I found the Table View column limits, and I see they have not changed: FileMaker Pro is limited to 99 fields/columns. I wish this information were available somewhere besides the Technical Specifications. In the help, it would be nice to see it listed under the “Working with data in Table View” topic. Searching the KB is just about pointless, but maybe I don’t understand the search logic. I found my answer in a Google search, and then I was able to find it in the KB once I knew the KB number. For FileMaker 11, it’s KB #7541.
This limit is also tied to the layout width. The limit can be less than 99 if the layout width is at or near the maximum of 110 inches (7,920 pixels). However, this is also inconsistent. I have a Table View layout with just 79 fields and a layout width of 89 inches, and I had to decrease the column widths just for the newly added fields to show up on the layout and in the Modify Table View window. It would at least be helpful if, when you add fields via the Modify window, they were grayed out or otherwise flagged when they cannot be displayed on the layout.
If you’re missing fields in Table View, try decreasing the column widths and see if they show up.


Best FileMaker Server RAID Configuration

RAID 10 is recommended for FileMaker Server; it combines striping and mirroring, giving you redundancy along with the best performance.

If you have a mirrored RAID 1 plus a separate RAID 5 or 10, the OS should go on the RAID 1, with FileMaker and the live databases on the RAID 5 or 10. The backups should be on a separate disk and channel to take advantage of full async reads/writes. You might get better performance making your backups to the RAID 1 drive with the OS. If anything, just use the RAID 1 for backups and leave the OS, FileMaker and the live files on the RAID 5 or 10 drives.

Having the FMS application, backups and logs on a separate partition on a RAID drive does nothing for performance. A partition does not mean separate disks.

UPDATE: Recently, I did a lot of work with temp files, RAM drives, RAIDs and SSDs, and I’ve changed my recommendation on what I wrote here regarding the OS on RAID 1. The ideal setup would be the OS, FileMaker Server, and backups on RAID 10, with the live databases and database temp files on an SSD RAID or SSD PCI card. Actual FileMaker Server backups happen to a RAM drive, and then a script moves those files to the RAID 10. If you are using FileMaker 12, progressive backups negate the need for backups to a RAM drive. However, the progressive backups should be located on the RAID 10 volume and not on the SSD where the live databases reside.


Configuring FileMaker Server Cache

Fine-tuning the FileMaker Server cache can greatly improve performance in a large or heavily used solution. So, what is the optimal cache setting? The general rule is to set the RAM cache to the setting that FileMaker Server recommends. I would leave the flush interval at the default 1 minute; increasing the interval could leave you at a greater risk of losing data.

Depending on what your solution is designed to do, how active it is, and whether you are writing or reading more data, you may be able to adjust the cache to tweak performance. Think of it this way: if you are writing more data by doing more edits or creating more records, then a smaller cache size may improve performance, since smaller chunks of data are written at a time. If you are reading more data, like running reports, then a larger cache size may improve performance, since more data will be accessed from RAM instead of having to be read repeatedly from disk.

On Windows Server 2003 or 2008, I recently confirmed that the memory usage of fmserver.exe is directly related to the cache size. Whatever the cache is set to becomes the minimum memory usage, plus whatever extra RAM FMS needs. The cache memory is reserved regardless of whether it’s full or empty; it works a bit like a RAM disk.

If you open Task Manager, click the Processes tab and find fmserver.exe, you can watch the memory usage. Then go to the Admin Console, Configuration > Database Server > Databases, change the cache size and click Save. Watch the memory usage for fmserver.exe and you’ll see it change as you change the cache. There is no need to restart FMS. Now you can tweak the cache setting until you find the optimum performance.


FileMaker Server Bottlenecks

One of my clients is running FMS on Windows Server 2003 on a maxed-out box that is extremely fine-tuned. I rarely see the processors over 7%, although at times one of the 8 cores maxes out. The same goes for memory and networking, although the network does see more use. Disk usage is active, but I’m running on RAID 10 with 15K disks, and backups are made to a RAM disk, from which an OS script passes them off to an entirely separate hard drive system. In my opinion, FileMaker does not do a good job of taking advantage of multiple processors.

It’s all about how FMS queues requests on specific files and one unindexed search, or something else that takes processing power like running a report with unindexed calculations, can bog down an entire group of users. If a particular file is extremely active and you run an unindexed search, that file is handed off to a single core to process one task at a time. You can just imagine what can happen here if you’re at the end of a long queue. I cannot remember if it’s one file or one table within a file and I have not done enough testing here to confirm. But, this is why in a system with multiple modules (and files for each), a solution may “seem” slow in one area, but not in others.

Even a table with hundreds of fields is much better split into smaller tables, each with fewer fields, since the server passes all the data in all the fields of a single table over to the client when the client has to process certain requests. This can be another major bottleneck.


Improve Finds

When the Constrain Found Set option first came out, I was able to drastically speed up some search routines. Finds with exact field matches (==) were being used, and they were SLOOOW when running on a table with over a million records. Removing the exact field matches from the first search pass to find a subset of records, then adding a Constrain Found Set with the exact field match (==) to run on that subset, greatly improved the performance of the search.
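A sketch of the two-pass approach as script steps, with hypothetical table and field names:

```
# Pass 1: broad find with no exact-match operator
Enter Find Mode [ ]
Set Field [ Orders::Status ; "Shipped" ]
Perform Find [ ]

# Pass 2: the exact match (==) runs only against the subset
Enter Find Mode [ ]
Set Field [ Orders::CustomerID ; "==" & $customerID ]
Constrain Found Set [ ]
```

The first pass uses a cheap, broad criterion to shrink the found set, so the slower == comparison only has to touch those records instead of the whole table.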


Prevent Error 101, Server-Side

When you run a script that loops through records with “Exit after last” checked, you get a 101 error, “Record is missing.” I “understand” this is supposedly expected behavior; however, to me it’s an erroneous error, since you have the box checked to exit after the last record. It’s kind of like “duh, I know the next record is missing, that’s why I’m exiting this loop.”
This is a problem when running the script server-side. The schedule indicates an error and the log viewer shows “FileMaker Script Error.” The script actually performed OK and was successful, but reported the 101 error and shows the error status.
I tried trapping for error 101 and then just exiting the script, but the error was still reported to the FMS log and displayed in the Schedule Window as “FileMaker Script Error.”
The only way around the error is to put a condition to check if you are on the last record, something like:
Exit Loop If [Get ( FoundCount ) = Get ( RecordNumber )]
Put this exit loop condition right before the Go to Record/Request/Page [Next] script step.
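Putting it together, the whole loop sketched as script steps (with “Exit after last” left unchecked):

```
Go to Record/Request/Page [ First ]
Loop
  # ...process the current record here...
  Exit Loop If [ Get ( FoundCount ) = Get ( RecordNumber ) ]
  Go to Record/Request/Page [ Next ]
End Loop
```

Because the loop exits on the last record before trying to move past it, error 101 is never generated, and the server-side schedule completes without an error status.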


Data Interchange

Recently I’ve been working on exports from FileMaker that are being used to import into other systems. FileMaker is great at accepting and interpreting many formats, but the reverse is not always true. I find it interesting how inflexible those other systems are.

When exporting data in CSV format, FileMaker puts quotes around the data in every field, including number fields. Excel, by contrast, only quotes a field when the data contains a comma (or a quote or line break). I was told by a developer of another database system that the CSV I’m supplying is not in “standard CSV format.” But there is no single mandatory CSV standard; even RFC 4180 explicitly allows every field to be quoted, whether or not it needs to be. In order to make the client happy and conform to their inflexible CSV import, I have to open and save the file in Excel before posting it to their FTP site. An extra step that’s totally unnecessary.
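The two quoting styles are easy to see side by side with Python’s csv module; a small sketch (the sample row is made up):

```python
import csv
import io

rows = [["Acme, Inc.", "42", "widget"]]

# Quote every field, as FileMaker does on export
buf_all = io.StringIO()
csv.writer(buf_all, quoting=csv.QUOTE_ALL).writerows(rows)

# Quote only when the data requires it (comma, quote, or
# line break in the field), as Excel does
buf_min = io.StringIO()
csv.writer(buf_min, quoting=csv.QUOTE_MINIMAL).writerows(rows)

print(buf_all.getvalue())   # "Acme, Inc.","42","widget"
print(buf_min.getvalue())   # "Acme, Inc.",42,widget
```

Both lines are valid CSV; a conforming parser should accept either one.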

I had similar experiences with tab-delimited files exported from FileMaker. A client requested the files as pipe-delimited, but tab would also be acceptable, although not preferred. Most of the work I do is on a Mac, but I make sure it’s compatible across both platforms. I usually export in a Unicode format. However, I was told there were no carriage returns, so I exported in ASCII (DOS) format. I was again told there were no carriage returns in the document and everything was on one long line. They also wanted the file saved with a .txt extension instead of a .tab extension, stating that .tab was specific to Tableau. This is just not true; although .tab may be an extension used by Tableau, it’s not the only application associated with the extension.

The actual problem with the file is the issue of carriage returns (CR) and line feeds (LF). Older Macs used a carriage return and newer Macs use a line feed, like their Unix counterparts, including Linux. Windows requires both a carriage return and a line feed (CRLF). If you transfer the files via FTP in ASCII mode, the conversion between platforms is supposed to be taken care of automatically. In my experience, this does not always work. I’m looking for a solution now and will post it once I have it. In the meantime, I’m exporting on a PC.
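Until FTP ASCII mode cooperates, the line endings can be normalized explicitly; a minimal sketch (the filenames in the comments are hypothetical) that converts any mix of CR, LF, or CRLF into the CRLF that Windows expects:

```python
def to_crlf(text: str) -> str:
    """Normalize CR, LF, or CRLF line endings to CRLF."""
    # Collapse every ending to a bare LF first, then expand to CRLF
    return text.replace("\r\n", "\n").replace("\r", "\n").replace("\n", "\r\n")

# Read the export as binary so the platform doesn't translate endings,
# then write it back with CRLF endings, e.g.:
#   data = open("export.tab", "rb").read().decode("utf-8")
#   open("export_win.tab", "wb").write(to_crlf(data).encode("utf-8"))

print(repr(to_crlf("a\rb\nc\r\nd")))  # 'a\r\nb\r\nc\r\nd'
```

The collapse-then-expand order matters: replacing LF with CRLF first would turn an existing CRLF into CRCRLF.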

Although FileMaker is flexible with file formats it can import, many systems are not. Keep this in mind when exporting data for use in other systems.


Normalizing Data with Google Refine

If you have messy data, try Google Refine. There is nothing like dealing with messy data that is all over the place. We often have to normalize the data before importing and this is a great tool to help with data normalization.


Interactive Charts with Highcharts

Highcharts is an interactive JavaScript charting library. The charts can be displayed in any browser and can also be viewed directly in FileMaker using the web viewer.

Click here to see how I recently used Highcharts. Make sure to check out the demo gallery to see all the charting options available.
