Invalid JSON data returned from the API (data type inconsistency)


I have noticed that the API outputs invalid JSON data. When I say invalid, I mean invalid in the context of the data being output: it is only valid if you are happy to receive values that should be numbers as strings.

The issue is that some of the values which should be numbers are being sent by the API as strings. This is not the case for all of the data I would expect to be numbers, just some of it.

Here are some examples; in each one, some numbers are output correctly and some are not:

For Users, the data type is correct for "permission_level" but incorrect for "opening_mileage", which should be a number, not a string; "0.0" quoted as seen below is a string representation.
The output needs to be like this: "opening_mileage": 0.0,

"users": [
  {
    "url": "",
    "first_name": "joe",
    "last_name": "bloggs",
    "email": "",
    "role": "Director",
    "permission_level": 8,
    "opening_mileage": "0.0",
    "updated_at": "2017-04-04T00:00:43.000Z",
    "created_at": "2016-12-09T00:00:31.000Z"
  }
]

For Timeslips, "hours" should be a number, not a string; "7.0" quoted is a string representation.
The output needs to be like this: "hours": 7.0,

"timeslips": [
  {
    "url": "",
    "user": "",
    "project": "",
    "task": "",
    "dated_on": "2017-03-20",
    "hours": "7.0",
    "updated_at": "2017-03-22T13:00:15.000Z",
    "created_at": "2017-03-22T13:00:15.000Z"
  }
]

For Tasks, "billing_rate" should be a number, not a string.
Booleans seem to be correct from what I have seen so far.

"tasks": [
  {
    "url": "",
    "project": "",
    "name": "Design",
    "is_billable": true,
    "billing_rate": "5.0",
    "billing_period": "hour",
    "status": "Active",
    "created_at": "2016-12-09T00:58:32.000Z",
    "updated_at": "2016-12-09T00:58:32.000Z"
  }
]

For Projects, "normal_billing_rate" and "hours_per_day" should be numbers, not strings.
"budget" is correct.

"project": {
  "url": "",
  "name": "Project 1",
  "contact": "",
  "currency": "GBP",
  "created_at": "2016-12-09T00:53:02.000Z",
  "updated_at": "2017-03-22T00:28:48.000Z",
  "starts_on": "2016-12-01",
  "ends_on": "2017-03-23",
  "budget": 80,
  "is_ir35": false,
  "contract_po_reference": "00111",
  "status": "Active",
  "budget_units": "Hours",
  "normal_billing_rate": "5.0",
  "hours_per_day": "7.5",
  "uses_project_invoice_sequence": false,
  "billing_period": "hour"
}

As can be seen, the output is not consistent, which caused a moment of "what the heck" when first discovered. For now I am patching the output to convert the strings to numbers.

There may be other examples, but those listed are what I have found so far.
From what I have seen, the issue seems to be limited to floating point numbers. I don't know what JSON serializer you are using, but it doesn't seem to be standards based.
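For what it's worth, the patch I mentioned is roughly along these lines. This is only a minimal Ruby sketch; the field list is assumed from the examples above and is not exhaustive:

```ruby
require 'json'

# Fields observed to arrive as decimal strings (assumed list, not exhaustive).
DECIMAL_FIELDS = %w[opening_mileage hours billing_rate
                    normal_billing_rate hours_per_day].freeze

# Recursively walk a parsed response and convert the known decimal-string
# fields to floats. Assumes those fields are never null in the response.
def coerce_decimals(value)
  case value
  when Hash
    value.each_with_object({}) do |(k, v), out|
      out[k] = DECIMAL_FIELDS.include?(k) ? Float(v) : coerce_decimals(v)
    end
  when Array
    value.map { |v| coerce_decimals(v) }
  else
    value
  end
end

parsed = JSON.parse('{"users":[{"permission_level":8,"opening_mileage":"0.0"}]}')
patched = coerce_decimals(parsed) # "opening_mileage" is now the Float 0.0
```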


Hi Chris,

The examples you've given would indeed be invalid if the values were floats. However, these are decimal values and as such should not be output unquoted in JSON, as the deserialiser may assume them to be floats.

From a quick search on Google, it seems this is not an uncommon practice in JSON APIs; PayPal, for example, uses the same convention, according to this Stack Overflow post:

The best explanation I can see for this is in the Rails documentation:

A BigDecimal would be naturally represented as a JSON number. Most libraries, however, parse non-integer JSON numbers directly as floats. Clients using those libraries would get in general a wrong number and no way to recover other than manually inspecting the string with the JSON code itself.

That’s why a JSON string is returned. The JSON literal is not numeric, but if the other end knows by contract that the data is supposed to be a BigDecimal, it still has the chance to post-process the string and get the real value.
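In Ruby, for instance, the post-processing step the docs describe is a one-liner with the stdlib's BigDecimal (the "hours" field here is taken from the timeslip example above):

```ruby
require 'json'
require 'bigdecimal'

timeslip = JSON.parse('{"hours":"7.0"}')

# The JSON literal is a string, but the contract says it is a decimal,
# so convert it explicitly rather than letting the parser guess float.
hours = BigDecimal(timeslip['hours'])

hours + BigDecimal('0.5') == BigDecimal('7.5') # exact decimal arithmetic, => true
```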

I'll check and update our documentation to make sure this is explicit. It could well be that understanding of this behaviour is assumed, thus causing your "what the heck" moment.

Kind Regards,

Thanks Paul, that is surprising. However, unless the API is sending non-integer numbers from the backend with enough precision for it to potentially be lost when converting to float (if they are stored as double or decimal on the backend), it won't matter. It is then up to the developer to convert to a more precise representation such as double or decimal before performing any maths on the numbers.

Looking at the API documentation, the fields in question are indeed listed as "Decimal", but if the API is to break from the standard to account for deficiencies in client libraries (which may or may not matter), it should be clearly documented; maybe list the type as "String (decimal representation)".

I also noticed originally that every field that is not a date, an ID, or a string seems to be listed as Decimal regardless of the expected precision (e.g. percentages). Even items such as the quantity on an invoice (does anyone invoice for half an item?) are Decimal, so it made me think that everything is just listed as Decimal regardless of precision.

Hi Chris,

Yes, most fields are indeed decimal. Amongst our customers there are common use cases where decimals are important, for example:

  • People do indeed invoice for half an item. They may be selling their product by weight, or they may be invoicing for 2.5 weeks of work. Not everything that our users sell can be measured in single units, hence the need to support fractional quantities.
  • Percentages (usually around sales tax) are not always nice easy numbers. If, for example, a user receives a bill which contains a mixture of VAT and non-VAT-rated items, the overall sales tax percentage is probably not going to be just a couple of decimal places. In order to support this and store our users' data accurately, we need to be able to allow higher precision.

As I've said, I'll update our documentation to show more explicitly how our decimal numbers are represented, but the bottom line is that our system stores all public-facing fractional values as decimal, deals with them in code as decimal, and renders them as decimal. Even for operations on simple floating point numbers, error can creep in. For example:

irb(main):002:0> 1.03 - 0.42
=> 0.6100000000000001
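The same subtraction performed on decimals, here using Ruby's stdlib BigDecimal, is exact:

```ruby
require 'bigdecimal'

# No binary floating point rounding: the result is exactly 0.61.
BigDecimal('1.03') - BigDecimal('0.42') == BigDecimal('0.61') # => true
```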

This is why all the non integer values returned from our API are explicitly decimal strings rather than anything which could potentially be mistaken for a floating point value.

Our JSON serializer's convention is not uncommon and is unlikely to change. Maybe there's some means in your deserialiser to configure it to handle these responses? If you let me know what your environment is, I'll be happy to take a look and see if I can suggest anything.