Do you want to understand your New Relic data better? New Relic Query Language (NRQL) can help. NRQL lets you analyze your data in real time. Although it can seem overwhelming at first glance, worry not.
In part 1 and part 2, we covered 10 essential NRQL functions and clauses that help you understand NRQL better and gain deeper insights into your data, including percentages, heatmaps, conditional widgets, and a few more tools for analyzing your telemetry data with ease.
In this part of the blog series, we explore existing functions and exciting new additions to NRQL that can assist with data transformations, formatting, and visualizations.
Formatting
toDatetime
Use this function to transform a raw timestamp into your preferred formatted datetime string.
This is helpful when dealing with large datasets or logs where the timestamps are not easily readable or in multiple different formats. With this function, you can convert these raw timestamps into a more human-readable format, making it easier to analyze and understand your data.
toDatetime() takes three arguments, of which two are optional:
- Timestamp: A raw timestamp in epoch milliseconds format, for instance 1709787389000.
- (optional) Pattern: A string pattern to format the date and time, such as yyyy-MM-dd HH:mm:ss or yyyy-MMM-dd hh:mm:ss, among many others.
- (optional) Timezone: A third positional parameter to specify the timezone. Refer to the references section for the list of all supported timezone string values.
Here is an example of a timestamp formatted with the pattern yyyy-MM-dd HH:mm:ss z and the timezone America/Chicago:
WITH 1709787389000 as timestampValue
SELECT toDatetime(timestampValue,'yyyy-MM-dd HH:mm:ss z', timezone: 'America/Chicago') as formattedTimeStamp
You can also use the WITH TIMEZONE clause instead of passing the timezone as the third parameter:
WITH 1709787389000 as timestampValue
SELECT toDatetime(timestampValue, 'yyyy-MM-dd hh:mm:ss') AS 'Formatted TimeStamp'
FROM Event WITH TIMEZONE 'America/Los_Angeles'
If no pattern & timezone parameters are supplied, the function defaults to the 1970-01-01T00:20:34.567Z
format, which is also in UTC.
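If you just want that default output, here's a minimal sketch that passes only the required timestamp argument; based on the behavior described above, the result should come back as an ISO 8601 string in UTC:
WITH 1709787389000 as timestampValue
SELECT toDatetime(timestampValue) AS 'Default Format'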
References
- epoch milliseconds
- All the Patterns for Formatting and Parsing
- List of all the supported timezone strings
Transformations
jparse
Use this function to parse a raw JSON string from events or logs into a map of structured values that can be referenced directly in NRQL like a first-class type.
jparse() takes two arguments:
- Attribute: A JSON string value.
- (optional) Path: A path expression to extract specific values from the JSON string.
Here’s a simple example that extracts the name field's value from a raw JSON string:
WITH '{"name": "Avatar", "gender": "M", "age": 14 }' as jsonString
SELECT jparse(jsonString)[name] as Name
If no key selector is supplied, the jparse() function returns the serialized JSON.
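To see this behavior, run the same query without a key selector; the whole object should come back as serialized JSON:
WITH '{"name": "Avatar", "gender": "M", "age": 14 }' as jsonString
SELECT jparse(jsonString) as ParsedJSON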
jparse can also work with an array string. A simple array can be queried the same way as a JSON object, using the index value as the selector:
WITH '["key1","key2"]' as jsonString
SELECT jparse(jsonString)[0]
The path parameter is helpful when the JSON is nested and multiple values need to be extracted. In the following example, the users array contains a list of users with their names and user IDs. The path parameter is provided with a wildcard to fetch the ID of each user.
WITH '{"users": [{"name": "A", "id": 5}, {"name": "B", "id": 10}]}' as jsonString, jparse(jsonString, 'users[*].id') as ids
SELECT ids as UserIDs
References:
- For more information, see the path syntax reference.
mapKeys
Use this function to extract the list of keys from the provided map or JSON string. A JSON string can be converted to a map using the jparse() function.
mapKeys() takes a single argument, attribute: a map dataset.
Here’s a simple example of extracting all the keys from a provided JSON string:
WITH '{"name": "Avatar", "gender": "M", "age": 14 }' as jsonString
SELECT mapKeys(jparse(jsonString)) as keys
The above NRQL query deserializes the JSON string into a map and returns all of its keys; that is, ['name', 'gender', 'age'].
This function also works with nested JSON strings; however, it extracts the first-level keys only:
WITH '{"value1": "test", "value2": {"nestedValue1": [1, 2, 3], "nestedValue2": 100}}' as jsonString
SELECT mapKeys(jparse(jsonString)) AS keys
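In practice, the JSON string usually comes from an attribute on your events or logs rather than a literal. Here's a minimal sketch, assuming a hypothetical Log attribute named messageJson that holds a JSON payload:
// messageJson is a hypothetical JSON-bearing attribute used for illustration
FROM Log SELECT mapKeys(jparse(messageJson)) AS keys SINCE 1 hour ago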
mapValues
Like the mapKeys function, mapValues can help extract all the values from the provided map dataset.
mapValues() takes a single argument, attribute: a map dataset.
Let's extract the values from this simple JSON string:
WITH '{"name": "Avatar", "gender": "M", "age": 14 }' as jsonString
SELECT mapValues(jparse(jsonString)) as Values
It works the same way for a nested JSON string; however, it only extracts and deserializes the first-level values.
WITH '{"value1": "test", "value2": {"nestedValue1": [1, 2, 3], "nestedValue2": 100}}' as jsonString
SELECT mapValues(jparse(jsonString)) AS values
The nested object value can be viewed by changing the visualization to the JSON chart type.
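Alternatively, if you only need a specific nested value, you can pass a path to jparse() as shown earlier. Here's a sketch that assumes dot notation is supported, per the path syntax reference, to pull out nestedValue2 directly:
// assumes dot-notation paths are supported by the path syntax
WITH '{"value1": "test", "value2": {"nestedValue1": [1, 2, 3], "nestedValue2": 100}}' as jsonString
SELECT jparse(jsonString, 'value2.nestedValue2') as nestedValue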
Visualizations
In the previous blog posts of this series, we covered some out-of-the-box visualizations like heat maps, histograms, and map widgets. Here I want to (re)introduce an existing NRQL clause that can help smooth your charts or provide more data points for visualizations like line, area, or stacked bar charts.
Slide By
The SLIDE BY clause enables something called sliding windows. Sliding windows group data into overlapping time windows or time buckets, which is different from cascading windows.
These windows can help to smooth out line graphs, providing a higher resolution for your charts. This feature can be particularly useful when you want to evaluate a larger dataset in a TIMESERIES format with more data in each time bucket.
Syntax
The SLIDE BY clause can be used together with the TIMESERIES clause. The syntax is as follows:
SELECT ... TIMESERIES units SLIDE BY units
Usage
In the following example, we query the average duration of transactions SINCE 5 MINUTES AGO, using TIMESERIES buckets of 1 minute each. The stacked chart below displays results from 5 buckets, or time windows.
SELECT average(duration) FROM Transaction
TIMESERIES 1 minute SINCE 5 minutes ago
The same chart can be displayed without the gaps in each bucket by using the SLIDE BY clause.
SELECT average(duration) FROM Transaction
TIMESERIES 1 minute SLIDE BY 30 seconds
SINCE 5 minutes ago
Here is another example with PageView metrics in a line chart. The TIMESERIES clause provides more buckets and a larger dataset to analyze, but if you want to analyze more data points within the sliding window range, just add the SLIDE BY clause to the same query, as shown in this example:
FROM PageView SELECT count(*) TIMESERIES 1 minute SLIDE BY 10 seconds
Notice that the bucket size remains the same, but there are far more data points than before, giving you a higher-resolution view of the PageView metrics.
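The same idea scales to longer time ranges. As an illustrative sketch (the bucket and slide sizes here are arbitrary), widening the window while sliding it in smaller steps keeps the chart smooth across a larger dataset:
// illustrative bucket and slide sizes
FROM PageView SELECT count(*) TIMESERIES 5 minutes SLIDE BY 1 minute SINCE 3 hours ago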
References:
- Learn more about sliding windows
- Dive deeper into SLIDE BY by watching this Nerd Bytes video.
Conclusion
By mastering NRQL, you can capture, transform and interpret your data, allowing you to break down the big picture into easily understandable pieces. This will help you identify problems as they occur and analyze your data in real time, making valuable use of your telemetry data. Whether you’re looking to optimize performance, identify trends, monitor key metrics to drive business decisions, or simply create dashboards with custom data, NRQL is an indispensable tool that can help you understand your data comprehensively.
What next?
- You can learn NRQL interactively by using our NRQL Lesson app.
- Explore New Relic's Query Builder and build your custom charts and dashboards.
- Learn about our NRQL Console, and become an NRQL master.
- Coming from the Prometheus world? Worry not. New Relic supports PromQL-style queries.