Splunk Book






Description

Splunk Overview Training

Duration: 3 days
Skill Level: Introductory and beyond
Hands-On Format: This hands-on class runs at approximately a 50/50 ratio of hands-on lab to lecture, combining engaging lecture, demos, group activities and discussions with machine-based practical student labs and project work.

Course Overview

Are you in charge of creating Splunk knowledge objects for your organization? Then you will benefit from this course, which walks you through the various knowledge objects and how to create them. Working with Splunk is a comprehensive hands-on course that teaches students how to search, navigate, tag, build alerts, and create simple reports and dashboards in Splunk, and how to use Splunk's Pivot interface. Working in a hands-on learning environment, students will learn how to use Splunk analytics to search large volumes of data efficiently. Students will learn how to run basic searches, save and share search results, create tags and event types, create reports, create different charts, perform calculations and format search data, and enrich data with lookups. Examples center on financial institution scenarios.

What You'll Learn: Course Objectives

After completion of this Splunk course, you will be able to:
- Get insight into the Splunk Search app
- Save and share search results
- Understand the use of fields in searching
- Learn search fundamentals using Splunk
- Explore the available visualizations in the software
- Create reports and different chart types
- Perform data analysis, calculation and formatting
- Understand and execute various techniques for enriching data with lookups

Recommended Audience & Pre-Requisites

This is a technical class for technical people, geared for users, administrators, architects, developers and support engineers who are new to Splunk. This course is ideal for anyone in your organization who needs to examine and use IT data. Ideal attendees include:
- Beginners in Splunk who want to enhance their knowledge of how to use this software
- System administrators and software developers
- Professionals who are eager to learn to search and analyze machine-generated data using fast, agile software

Course Topics & Agenda

Course Modules 1-4: Day 1 - Morning

Module 1 - Basic Understanding of Architecture (Overview)
- What are the components?
- Discussion on Forwarders - UF/HF
- Common ports for the set up
- License Master/Slave relationship
- Understanding of Deployment Server and Indexer

Module 2 - Introduction to Splunk's User Interface
- Understand the uses of Splunk
- Define Splunk Apps
- Learn basic navigation in Splunk
- Hands on Lab covering: Basic Navigation
- End of Module Hands-on Quiz

Module 3 - Searching
- Run basic searches
- Set the time range of a search
- Hands on Lab covering: Run basic searches, Set the time range of a search
- Identify the contents of search results
- Refine searches
- Hands on Lab covering: Identify the contents of search results, Refine searches
- Use the timeline
- Work with events
- Hands on Lab covering: Use the timeline, Work with events
- Control a search job
- Save search results
- Hands on Lab covering: Control a search job, Save search results
- End of Module Hands-on Quiz

Module 4 - Using Fields in Searches
- Understand fields
- Use fields in searches
- Use the fields sidebar
- Hands on Lab covering: Understand fields, Use fields in searches, Use the fields sidebar
- End of Module Hands-on Quiz

Course Modules 5-7: Day 1 - Afternoon

Module 5 - Creating Reports and Visualizations
- Save a search as a report
- Edit reports
- Create reports that include visualizations such as charts and tables
- Hands on Lab covering: Save a search as a report, Edit reports, Create reports that include visualizations such as charts and tables
- Add reports to a dashboard
- Create an instant pivot from a search
- Hands on Lab covering: Add reports to a dashboard, Create an instant pivot from a search
- End of Module Hands-on Quiz

Module 6 - Working with Dashboards
- Create a dashboard
- Add a report to a dashboard
- Hands on Lab covering: Create a dashboard, Add a report to a dashboard
- Add a pivot report to a dashboard
- Edit a dashboard
- Hands on Lab covering: Add a pivot report to a dashboard, Edit a dashboard
- End of Module Hands-on Quiz

Module 7 - Search Fundamentals
- Review basic search commands and general search practices
- Examine the anatomy of a search
- Use the following commands to perform searches: fields, table, rename, rex, multikv
- Hands on Lab covering: Review basic search commands and general search practices, Examine the anatomy of a search, Use the fields, table, rename, rex and multikv commands to perform searches
- End of Module Hands-on Quiz

Course Modules 8-10: Day 2 - Morning (Deep Dive Topics)

Module 8 - Reporting Commands, Part 1
- Use the following commands and their functions: top, rare
- Hands on Lab covering: Top, Rare
- Stats
- Addcoltotals
- Hands on Lab covering: Stats, Addcoltotals
- End of Module Hands-on Quiz

Module 9 - Reporting Commands, Part 2
- Explore the available visualizations
- Create a basic chart
- Split values into multiple series
- Hands on Lab covering: Explore the available visualizations, Create a basic chart, Split values into multiple series
- Omit null and other values from charts
- Create a time chart
- Chart multiple values on the same timeline
- Hands on Lab covering: Omit null and other values from charts, Create a time chart, Chart multiple values on the same timeline
- Format charts
- Explain when to use each type of reporting command
- Hands on Lab covering: Format charts, Explain when to use each type of reporting command
- End of Module Hands-on Quiz

Module 10 - Analyzing, Calculating and Formatting Results
- Using the eval command
- Perform calculations
- Convert values
- Hands on Lab covering: Using the eval command, Perform calculations, Convert values
- Round values
- Format values
- Hands on Lab covering: Round values, Format values
- Use conditional statements
- Further filter calculated results
- Hands on Lab covering: Use conditional statements, Further filter calculated results
- End of Module Hands-on Quiz

Course Modules 11-12: Day 2 - Afternoon (Deep Dive Topics)
Module 11 - Creating Field Extractions
- Perform field extractions using the Field Extractor
- Hands on Lab covering: Perform field extractions using the Field Extractor
- End of Module Hands-on Quiz

Module 12 - Creating Field Aliases and Calculated Fields
- Define naming conventions
- Create and use field aliases
- Create and use calculated fields
- Hands on Lab covering: Define naming conventions, Create and use field aliases, Create and use calculated fields
- End of Module Hands-on Quiz

Course Modules 13-15: Day 3 - Morning

Module 13 - Creating Tags and Event Types
- Create and use tags
- Describe event types and their uses
- Create an event type
- Hands on Lab covering: Create and use tags, Describe event types and their uses, Create an event type
- End of Module Hands-on Quiz

Module 14 - Creating and Using Macros
- Describe macros
- Manage macros
- Create and use a basic macro
- Hands on Lab covering: Describe macros, Manage macros, Create and use a basic macro
- Define arguments and variables for a macro
- Add and use arguments with a macro
- Hands on Lab covering: Define arguments and variables for a macro, Add and use arguments with a macro
- End of Module Hands-on Quiz

Module 15 - Creating Workflow Actions
- Describe the function of a workflow action
- Create a GET workflow action
- Hands on Lab covering: Describe the function of a workflow action, Create a GET workflow action
- Create a POST workflow action
- Create a Search workflow action
- Hands on Lab covering: Create a POST workflow action, Create a Search workflow action
- End of Module Hands-on Quiz

Course Modules 16-17: Day 3 - Afternoon

Module 16 - Creating and Managing Alerts
- Describe alerts
- Create alerts
- View fired alerts
- Hands on Lab covering: Describe alerts, Create alerts, View fired alerts
- End of Module Hands-on Quiz

Module 17 - Using Pivot
- Describe Pivot
- Understand the relationship between data models and pivot
- Select a data model object
- Hands on Lab covering: Describe Pivot, Understand the relationship between data models and pivot, Select a data model object
- Create a pivot report
- Save a pivot report as a dashboard
- Hands on Lab covering: Create a pivot report, Save a pivot report as a dashboard
- End of Module Hands-on Quiz

Post Course Final Quiz

At the end of class, each attendee will take a post-course quiz that gauges the student's retention of the skills and topics covered throughout the course. The quiz will be distributed either on paper or online at the end of class and graded promptly.

Module 1 - Basic Understanding of Architecture (Overview)
- What are the components?
- Discussion on Forwarders - UF/HF
- Common ports for the set up
- License Master/Slave relationship
- Understanding of Deployment Server and Indexer

Section 1 - What are the components?

Splunk Enterprise performs three key functions as it moves data through the data pipeline. First, it consumes data from files, the network, or elsewhere. Then it indexes the data. (Actually, it first parses and then indexes the data, but for purposes of this discussion, we consider parsing to be part of the indexing process.) Finally, it runs interactive or scheduled searches on the indexed data.

You can split this functionality across multiple specialized instances of Splunk Enterprise, ranging in number from just a few to thousands, depending on the quantity of data you're dealing with and other variables in your environment. You might, for example, create a deployment with many instances that only consume data, several other instances that index the data, and one or more instances that handle search requests.
These specialized instances are known collectively as components. There are several types of components.

For a typical mid-size deployment, you can deploy lightweight versions of Splunk Enterprise, called forwarders, on the machines where the data originates. The forwarders consume data locally and then forward it across the network to another Splunk Enterprise component, called the indexer. The indexer does the heavy lifting; it indexes the data and runs searches. It should reside on a machine by itself. The forwarders, on the other hand, can easily co-exist on the machines generating the data, because the data-consuming function has minimal impact on machine performance.

This diagram shows several forwarders sending data to a single indexer. [diagram]

As you scale up, you can add more forwarders and indexers. For a larger deployment, you might have hundreds of forwarders sending data to a number of indexers. You can use load balancing on the forwarders, so that they distribute their data across some or all of the indexers. Not only does load balancing help with scaling, but it also provides a fail-over capability if one of the indexers goes down: the forwarders automatically switch to sending their data to any indexers that remain alive.

In this diagram, each forwarder load-balances its data across two indexers. [diagram]
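The load balancing just described is driven entirely by the forwarder's outputs configuration: listing more than one indexer in a target group is what turns it on. The following is a minimal sketch, not part of this course's lab environment; the group name, host names and the receiving port 9997 are illustrative placeholders.

    # outputs.conf on a forwarder (sketch; group name, hosts and port are examples)
    [tcpout:primary_indexers]
    server = indexer1.example.com:9997, indexer2.example.com:9997

With two servers in the list, the forwarder distributes events across both indexers, and if one goes down it switches to the survivor, which is the fail-over behavior described above.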
These are the fundamental components and features of a Splunk Enterprise distributed environment:
- Indexers
- Search heads
- Forwarders
- Deployment server

Indexer

A Splunk Enterprise instance that indexes data, transforming raw data into events and placing the results into an index. It also searches the indexed data in response to search requests. The indexer also frequently performs the other fundamental Splunk Enterprise functions: data input and search management. In larger deployments, forwarders handle data input and forward the data to the indexer for indexing. Similarly, in larger deployments, a specialized Splunk Enterprise instance called a search head handles search management and coordinates searches across multiple indexers, although indexers always perform searches across their own data.

Forwarder

A Splunk Enterprise instance that forwards data to another Splunk Enterprise instance, such as an indexer or another forwarder, or to a third-party system. There are three types of forwarders:
- A universal forwarder is a dedicated, streamlined version of Splunk Enterprise that contains only the essential components needed to send data. The universal forwarder is the best tool for forwarding data to indexers. Its main limitation is that it forwards only unparsed data; to send event-based (parsed) data to indexers, you must use a heavy forwarder.
- A heavy forwarder is a full Splunk Enterprise instance, with some features disabled to achieve a smaller footprint.
- A light forwarder is a full Splunk Enterprise instance, with most features disabled to achieve a small footprint. The light forwarder has been deprecated as of Splunk Enterprise version 6.0; the universal forwarder supersedes the light forwarder for nearly all purposes.

Search Heads

In a distributed search environment, a search head is a Splunk Enterprise instance that handles search management functions, directing search requests to a set of search peers and then merging the results back to the user. A Splunk Enterprise instance can function as both a search head and a search peer. A search head that performs only searching, and not any indexing, is referred to as a dedicated search head. Search head clusters are groups of search heads that coordinate their activities.

Deployment Server

A Splunk Enterprise instance that acts as a centralized configuration manager, grouping together and collectively managing any number of Splunk Enterprise instances. Instances that are remotely configured by deployment servers are called deployment clients. The deployment server downloads updated content, such as configuration files and apps, to deployment clients. Units of such content are known as deployment apps.

Section 2 - Discussion on Forwarders - UF/HF

The universal forwarder

The universal forwarder is Splunk's lightweight forwarder. You use it to gather data from a variety of inputs and forward the data to a Splunk Enterprise server for indexing and searching. You can also forward data to another forwarder, as an intermediate step before sending the data onwards to an indexer.

The universal forwarder's sole purpose is to forward data. Unlike a full Splunk Enterprise instance, you cannot use the universal forwarder to index or search data. To achieve higher performance and a lighter footprint, it has several limitations:
- The universal forwarder has no searching, indexing, or alerting capability.
- The universal forwarder does not parse data.

Heavy and light forwarders

While the universal forwarder is generally the preferred way to forward data, you might have reason (legacy-based or otherwise) to use heavy forwarders as well. Unlike the universal forwarder, which is an entirely separate, streamlined executable, both heavy and light forwarders are actually full Splunk Enterprise instances with certain features disabled.

A heavy forwarder (sometimes referred to as a "regular forwarder") has a smaller footprint than a Splunk Enterprise indexer but retains most of the capability, except that it lacks the ability to perform distributed searches. Much of its default functionality, such as Splunk Web, can be disabled, if necessary, to reduce the size of its footprint. A heavy forwarder parses data before forwarding it and can route data based on criteria such as source or type of event.
This table summarizes the similarities and differences between the universal forwarder and the heavy forwarder:

Features and capabilities                     | Universal forwarder | Heavy forwarder
Type of Splunk Enterprise instance            | Dedicated executable | Full Splunk Enterprise, with some features disabled
Footprint (memory, CPU load)                  | Smallest | Medium-to-large (depending on enabled features)
Bundles Python?                               | No | Yes
Handles data inputs?                          | All types (but scripted inputs might require a Python installation) | All types
Forwards to Splunk Enterprise?                | Yes | Yes
Forwards to 3rd-party systems?                | Yes | Yes
Serves as intermediate forwarder?             | Yes | Yes
Indexer acknowledgment (guaranteed delivery)? | Optional (version 4.2+) | Optional
Load balancing?                               | Yes | Yes
Data cloning?                                 | Yes | Yes
Per-event filtering?                          | No | Yes
Event routing?                                | No | Yes
Event parsing?                                | No | Yes
Local indexing?                               | No | Optional (by setting the indexAndForward attribute in outputs.conf)
Searching/alerting?                           | No | Optional
Splunk Web?                                   | No | Optional

Section 3 - Common ports for the set up

Splunk configures two ports at installation time:
- The HTTP/HTTPS port. This port provides the socket for Splunk Web. It defaults to 8000.
- The management port. This port is used to communicate with the splunkd daemon. Splunk Web talks to splunkd on this port, as does the command line interface and any distributed connections from other servers. This port defaults to 8089.

Let's log in to our lab environment

- Please go to: http://www.uxcreate.com/guacamole (user name: admin, password: admin).
- Your instructor will give you your machine number. Please remember your machine number throughout the training session.
- Then please go to Start > All Programs > Splunk Enterprise > Splunk Enterprise.
- The Splunk web interface should come up. The login details are username: admin, password: admin.

Section 4 - License Master/Slave relationship

Splunk Enterprise takes in data from sources you designate and processes it so that you can analyze it. We call this process indexing. Any host in your Splunk Enterprise infrastructure that performs indexing must be licensed to do so. You can either run a standalone indexer with a license installed locally, or you can configure one of your Splunk Enterprise instances as a license master and set up a license pool from which other indexers, configured as license slaves, can draw.

Splunk Enterprise licenses specify how much data you can index per calendar day (from midnight to midnight by the clock on the license master). When a license master instance is configured and license slaves are added to it, the license slaves communicate their usage to the license master every minute. If the license master is unreachable for any reason, the license slave starts a 72-hour timer. If the license slave cannot reach the license master for 72 hours, search is blocked on the license slave (although indexing continues). Users cannot search data in the indexes on the license slave until that slave can reach the license master again.
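Day-to-day license consumption can be checked with a search against the license master's internal logs. This is a common administrative pattern rather than part of this course's labs, and it assumes your role can read the _internal index on the license master; a minimal sketch:

    index=_internal source=*license_usage.log type=Usage | timechart span=1d sum(b) AS bytes_indexed

Each type=Usage event records the number of bytes indexed in its b field, so the daily sum can be compared against the licensed daily quota described above.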
Section 5 - Understanding of Deployment Server and Indexer

The indexer is the Splunk Enterprise component that creates and manages indexes. The primary functions of an indexer are:
- Indexing incoming data.
- Searching the indexed data.

In single-machine deployments consisting of just one Splunk Enterprise instance, the indexer also handles the data input and search management functions. For larger-scale needs, indexing is split out from the data input function and sometimes from the search management function as well. In these larger, distributed deployments, the indexer might reside on its own machine and handle only indexing, along with searching of its indexed data. In those cases, other Splunk Enterprise components take over the non-indexing roles.

For instance, you might have a set of Windows and Linux machines generating events, which need to go to a central indexer for consolidation. Usually the best way to do this is to install a lightweight instance of Splunk Enterprise, known as a forwarder, on each of the event-generating machines. These forwarders handle data input and send the data across the network to the indexer residing on its own machine.

Similarly, in cases where you have a large amount of indexed data and numerous concurrent users searching on it, it can make sense to split off the search management function from indexing. In this type of scenario, known as distributed search, one or more search heads distribute search requests across multiple indexers. The indexers still perform the actual searching of their own indexes, but the search heads manage the overall search process across all the indexers and present the consolidated search results to the user.

Here's an example of a scaled-out deployment. [diagram]

A deployment server uses server classes to determine what content to deploy to groups of deployment clients. The forwarder management interface offers an easy way to create, edit, and manage server classes.

Module 2 - Introduction to Splunk's User Interface
- Understand the uses of Splunk
- Define Splunk Apps
- Learn basic navigation in Splunk
- Hands on Lab covering: Basic Navigation
- End of Module Hands-on Quiz

Section 1 - Understand the uses of Splunk

Splunk Enterprise makes it simple to collect, analyze and act upon the untapped value of the big data generated by your technology infrastructure, security systems and business applications, giving you the insights to drive operational performance and business results. Just point your raw data at Splunk Enterprise and start analyzing your world. By monitoring and analyzing everything from customer clickstreams and transactions to security events and network activity, Splunk Enterprise helps you gain valuable Operational Intelligence from your machine-generated data. And with a full range of powerful search, visualization and pre-packaged content for use cases, any user can quickly discover and share insights.

- Collects and indexes log and machine data from any source
- Powerful search, analysis and visualization capabilities empower users of all types
- Apps provide solutions for security, IT ops, business analysis and more
- Enables visibility across on-premises, cloud and hybrid environments
- Delivers the scale, security and availability to suit any organization
- Available as software or as a SaaS (Software as a Service) solution

Section 2 - Define Splunk Apps

A Splunk App is a prebuilt collection of dashboards, panels and UI elements powered by saved searches and packaged for a specific technology or use case, to make Splunk immediately useful and relevant to different roles. As an alternative to using Splunk for searching and exploring, you can use Splunk Apps to gain the specific insights you need from your machine data. You can also apply user- and role-based permissions and access controls to Splunk Apps, thus providing a level of control when you are deploying and sharing Apps across your organization. Apps can be opened from the Splunk Enterprise Home page, from the App menu, or from the Apps section of Settings.

Section 3 - Learn basic navigation in Splunk

About Splunk Home

Splunk Home is your interactive portal to the data and apps accessible from this Splunk instance. The main parts of Home include the Splunk Enterprise navigation bar, the Apps menu, the Explore Splunk Enterprise panel, and a custom default dashboard (not shown here).

Apps

The Apps panel lists the apps that are installed on your Splunk instance that you have permission to view. For an out-of-the-box Splunk Enterprise installation, you see one app in the workspace: Search & Reporting.
Select the app from the list to open it. When you have more than one app, you can drag and drop the apps within the workspace to rearrange them. You can do two actions on this panel:
- Click the gear icon to view and manage the apps that are installed in your Splunk instance.
- Click the plus icon to browse for more apps to install.

About the Splunk bar

Use the Splunk bar to navigate your Splunk instance. It appears on every page in Splunk Enterprise. You can use it to switch between apps, manage and edit your Splunk configuration, view system-level messages, and monitor the progress of search jobs. The following screenshot shows the Splunk bar in Splunk Home. The Splunk bar in another view, such as the Search & Reporting app's Search view, also includes an App menu next to the Splunk logo.

Return to Splunk Home

Click the Splunk logo on the navigation bar to return to Splunk Home from any other view in Splunk Web.

Explore Splunk Enterprise

The options in the Explore Splunk Enterprise panel help you to get started using Splunk Enterprise. Click on the icons to open the Add Data view, browse for new apps, open the Splunk Enterprise documentation, or open Splunk Answers.

Settings menu

The Settings menu lists the configuration pages for Knowledge objects, Data, Distributed environment settings, System and licensing, and Authentication settings. If you do not see some of these options, you do not have the permissions to view or edit them.

User menu

The User menu here is called "Administrator" because that is the default user name for a new installation. You can change this display name by selecting Edit account and changing the Full name. You can also edit the time zone settings, select a default app for this account, and change the account's password. The User menu is also where you log out of this Splunk installation.

Messages menu

All system-level error messages are listed here. When there is a new message to review, a notification displays as a count next to the Messages menu. Click the X to remove a message.

Activity menu

The Activity menu lists shortcuts to the Jobs, Triggered alerts, and System Activity views.
- Click Jobs to open the search jobs manager window, where you can view and manage currently running searches.
- Click Triggered Alerts to view scheduled alerts that are triggered. This tutorial does not discuss saving and scheduling alerts; see "About alerts" in the Alerting Manual.
- Click System Activity to see dashboards about user activity and the status of the system.

Find

Use Find to search for objects within your Splunk Enterprise instance. These saved objects include Reports, Dashboards, Alerts, and Data models. Find performs non-case-sensitive matches on the ID, labels, and descriptions in saved objects. For example, if you type in "error", it returns the saved objects that contain the term "error". The results appear in the list separated by the categories where they exist. You can also run a search for error in the Search & Reporting app by clicking Open error in search.

Help

Click Help to see links to Video Tutorials, Splunk Answers, the Splunk Support Portal, and online documentation.

Hands on Lab covering: Basic Navigation

Take your time exploring the Splunk Web interface.

End of Module Hands-on Quiz

Please refer to your virtual machine for the test.

Module 3 - Searching
- Run basic searches
- Set the time range of a search
- Hands on Lab covering: Run basic searches, Set the time range of a search
- Identify the contents of search results
- Refine searches
- Hands on Lab covering: Identify the contents of search results, Refine searches
- Use the timeline
- Work with events
- Hands on Lab covering: Use the timeline, Work with events
- Control a search job
- Save search results
- Hands on Lab covering: Control a search job, Save search results
- End of Module Hands-on Quiz
Run basic searches

Types of searches

Before delving into the language and syntax of search, you should ask what you are trying to accomplish. Generally, after getting data into Splunk, you want to:
- Investigate, to learn more about the data you just indexed or to find the root cause of an issue.
- Summarize your search results into a report, whether in tabular or another visualization format.

Because of this, you might hear us refer to two types of searches: raw event searches and report-generating (transforming) searches.

Raw event searches

Raw event searches are searches that just retrieve events from an index or indexes, and are typically done when you want to analyze a problem. Some examples of these searches include: checking error codes, correlating events, investigating security issues, and analyzing failures. These searches do not usually include search commands (except search itself), and the results are typically a list of raw events.

Transforming searches

Transforming searches are searches that perform some type of statistical calculation against a set of results. These are searches where you first retrieve events from an index and then pass them into one or more search commands. These searches will always require fields and at least one of a set of statistical commands. Some examples include: getting a daily count of error events, counting the number of times a specific user has logged in, or calculating the 95th percentile of field values.

Information density

Whether you're retrieving raw events or building a report, you should also consider whether you are running a search for sparse or dense information:
- Sparse searches are searches that look for a single event or an event that occurs infrequently within a large set of data. You've probably heard these referred to as "needle in a haystack" or "rare term" searches. Some examples of these searches include: searching for a specific and unique IP address or error code.
- Dense searches are searches that scan through and report on many events. Some examples of these searches include: counting the number of errors that occurred or finding all events from a specific host.

Search and knowledge

As you search, you may begin to recognize patterns and identify more information that could be useful as searchable fields. You can configure Splunk to recognize these new fields as you index new data, or you can create new fields as you search. Whatever you learn, you can use, add, and edit this knowledge about fields, events, and transactions in your event data. This capturing of knowledge helps you to construct more efficient searches and build more detailed reports.
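To make the two search types described above concrete, here is one sketch of each; the sourcetype name is illustrative and would be whatever your web access data actually uses. A raw event search, which just retrieves matching events:

    sourcetype=access_combined status=404

And a transforming search, which pipes the retrieved events into a statistical command:

    sourcetype=access_combined | stats count by status

The first returns a list of raw events to inspect by eye; the second returns a small table with one row per status code, which is what makes it the basis for reports and charts.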
The anatomy of a search

To better understand how search commands act on your data, it helps to visualize all your indexed data as a table. Each search command redefines the shape of your table. For example, let's take a look at the following search:

sourcetype=syslog ERROR | top user | fields - percent

The disk represents all of your indexed data: it's a table of a certain size, with columns representing fields and rows representing events. The first intermediate results table shows fewer rows, representing the subset of events retrieved from the index that matched the search terms "sourcetype=syslog ERROR". Then, "top user" produces the second intermediate results table, representing the results of the top command, which summarizes the events into a list of the top 10 users and displays the user, count, and percentage. Finally, "fields - percent" removes the column that shows the percentage, so you are left with a smaller final results table.

Quotes and escaping characters

Generally, you need quotes around phrases and field values that include white spaces, commas, pipes, quotes, and/or brackets. Quotes must be balanced: an opening quote must be followed by an unescaped closing quote. Additionally, you want to use quotes around keywords and phrases if you don't want to search for their default meaning, such as Boolean operators and field/value pairs. For example:
- A search for the keyword AND without meaning the Boolean operator: error "AND"
- A search for this field/value phrase: error "startswith=foo"

Note the difference quoting makes:
- A search such as error | stats count will find the number of events containing the string error.
- A search such as ... | search "error | stats count" would return the raw events containing error, a pipe, stats, and count, in that order, instead of having the pipe split between commands.

The backslash character (\) is used to escape quotes, pipes, and itself, for example when searching for a literal quotation mark or inserting a literal quotation mark into a field using rex. Backslash escape sequences are still expanded inside quotes. For example:
- The sequence \| as part of a search will send a pipe character to the command, instead of having the pipe split between commands.
- The sequence \" will send a literal quote to the command.
- The \\ sequence will be available as a literal backslash in the command.

If Splunk does not recognize a backslash sequence, it will not alter it. For example, \s in a search string will be available as \s to the command, because \s is not a known escape sequence. However, in the search string \\s will be available as \s to the command, because \\ is a known escape sequence that is converted to \.

Asterisks

Splunk treats the asterisk character as a major breaker. Because of this, the asterisk cannot be searched for using a backslash to escape the character; it will never be in the index. If you want to search for the asterisk character, you will need to run a post-filtering regex search on your data:

index=_internal | regex ".*\*.*"

Examples

Example 1: myfield is created with the value of 6.
... | eval myfield="6"

Example 2: myfield is created with the value of ".
... | eval myfield="\""

Example 3: myfield is created with the value of \.
... | eval myfield="\\"

Example 4: This would produce an error because of unbalanced quotes.
... | eval myfield="\"

Set the time range of a search

Time is crucial for determining what went wrong. You often know when something happened, if not exactly what happened. Looking at events that happened around the same time can help correlate results and find the root cause. Searches run with an overly broad time range waste system resources and produce more results than you can handle.

Note: If you are located in a different timezone, time-based searches use the timestamp of the event from the instance that indexed the data.

Select time ranges to apply to your search

Use the time range picker to set time boundaries on your searches. You can restrict a search with preset time ranges, create custom time ranges, specify time ranges based on date or date and time, or work with advanced features in the time range picker.
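As an aside, everything the time range picker does can also be written inline with Splunk's standard earliest and latest time modifiers, which is useful once a search is saved or shared; the sourcetype here is again just an example:

    sourcetype=access_* earliest=-24h@h latest=now

The value -24h@h means 24 hours ago, snapped back to the hour boundary; an inline modifier takes precedence over whatever the time range picker is set to.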
These options are described in the following sections.

Select from a list of Preset time ranges

Define custom Relative time ranges

Use the custom Relative time range options to specify a time range for your search that is relative to now. You can select from the list of time range units: "Seconds ago", "Minutes ago", and so on. The labels for Earliest and Latest update to match your selection, and the preview boxes below the fields update to the time range as you set it.

Define custom Real-time time ranges

The custom Real-time option enables you to specify the start time for your real-time time range window.

Define custom Date ranges

Use the custom Date Range option to specify calendar dates in your search. You can choose among options to return events Between a beginning and end date, Before a date, and Since a date. For these fields, you can type the date into the text box or select the date from a calendar.

Define custom Date & Time ranges

Use the custom Date & Time Range option to specify calendar dates and times for the beginning and ending of your search. You can type the date into the text box or select the date from a calendar.

Use Advanced time range options

Use the Advanced option to specify the earliest and latest search times. You can write the times in Unix (epoch) time or relative time notation. The epoch time value you enter is converted to local time. This timestamp is displayed under the text field so that you can verify your entry.

Hands on Lab

Part 1 - Basic Concepts

There are a few concepts in the Splunk world that will be helpful for you to understand. I'll cover them in a few sentences, so try to pay attention. If you want more details, see the "Concepts" section near the end of this document.

Processing at the time the data is indexed: Splunk reads data from a source, such as a file or port, on a host (e.g. "my machine"), classifies that source into a sourcetype (e.g. "syslog", "access_combined", "apache_error", ...), then extracts timestamps, breaks up the source into individual events (e.g. log events), which can be single-line or multiple lines, and writes each event into an index on disk, for later retrieval with a search.

Processing at the time the data is searched: When a search starts, matching indexed events are retrieved from disk, fields (e.g. code=404, user=david, ...) are extracted from the event's text, and the event is classified by matching against eventtype definitions (e.g. 'error', 'login', ...). The events returned from a search can then be powerfully transformed using Splunk's search language to generate reports that live on dashboards.

Part 2 - Adding Data

Splunk can eat data from just about any source, including files, directories, ports, and scripts, keeping track of changes to them as they happen. We're going to start simple and just tell Splunk to index a particular file and not monitor it for updates:

1. Go to the Splunk Web interface (e.g. http://localhost:8000) and log in, if you haven't already.
2. Click Settings in the upper right-hand corner of Splunk Web.
3. Under Settings, click Add Data.
4. Click Upload Data to upload a file.
5. Click Select File.
6. Browse and find "websample.log" on your Desktop that we previously saved.
7. Accept all the default values and just click Submit.
8. Click Start Searching.

Assuming all goes well, websample.log is now indexed, and all the events are timestamped and searchable.
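One quick way to confirm the upload worked is to search on the source field, which Splunk sets from the uploaded file's name; the wildcard in this sketch allows for any path that may precede it:

    source="*websample.log" | head 10

The head command simply limits the output to the first 10 events, which is enough to confirm that the events were timestamped and broken up sensibly.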
Part 3 - Basic Searching

Splunk comes with several apps, but the only relevant one now is the Search app, which is the interface for generic searching. (More apps can be downloaded, and advanced users can build apps themselves.) After logging into Splunk, select the Search app and let's get started in searching. We'll start out simple and work our way up.

To begin your Splunk search, type in terms you might expect to find in your data. For example, if you want to find events that might be HTTP 404 errors (i.e. webpage not found), type in the keywords:

http 404

You'll get back all the events that have both HTTP and 404 in their text. Notice that search terms are implicitly AND'd together; the search was the same as "http AND 404". Let's make the search narrower:

http 404 "like gecko"

Using quotes tells Splunk to search for the literal phrase "like gecko", which returns more specific results than just searching for "like" and "gecko", because the words must be adjacent as a phrase.

Splunk supports the Boolean operators AND, OR, and NOT (they must be capitalized), as well as parentheses to enforce grouping. To get all HTTP error events (i.e. not the 200 status code), not including 403 or 404, use this:

http NOT (200 OR 403 OR 404)

Again, the AND operator is implied; the previous search is the same as http AND NOT (200 OR 403 OR 404).

Splunk supports the asterisk (*) wildcard for searching. For example, to retrieve events that have the 40x and 50x classes of HTTP status codes, you could try:

http (40* OR 50*)

Now here's your turn on your own:
1. Upload the file LoanStats3a.csv located on your desktop.
2. Search for entries that contain the word "divorced".
3. Search for entries that are divorced and renting.

Part 4 - Fields

When you index data, Splunk automatically adds fields (i.e. attributes) to each of your events. It does this based on some text patterns commonly found in IT data, and intermediate users can add their own extraction rules for pulling out additional fields. To narrow results with a search, just add attribute=value to your search:

sourcetype=access_combined status=404

This search is a much more precise version of our first search (i.e. "http 404"), because it will only return events that come from access_combined sources (i.e. webserver events) and that have a status code of 404. The "404" has to be found where a status code is expected in the event, which is different than just having a 404 somewhere in the text. In addition to <attribute>=<value>, you can also use != (not equals), and >, >=, <, and <= for numeric fields.

Part 5 - Search App

Now click on Search on the main toolbar. You will get the search screen. Click on the Data Summary button, then click on the Sources tab. Now you can choose websample.log.

Part 6 - Let's upload another sample file

1. Please upload sampledata.zip, which is located on the Desktop.
2. Notice there is no preview.
3. Please take the defaults and start searching.
4. On the Sourcetypes panel, click access_combined_wcookie.

You are a member of the Customer Support team for the online Flower & Gift shop. This is your first day on the job. You want to learn some more about the shop. Some questions you want answered are:
- What does the store sell?
- How much does each item cost?
- How many people visited the site?
- How many bought something today?
- What is the most popular item that is purchased each day?
It's your first day of work with the Customer Support team for the online Flower & Gift shop. You're just starting to dig into the Web access logs for the shop when you receive a call from a customer who complains about trouble buying a gift for his girlfriend: he keeps hitting a server error when he tries to complete a purchase. He gives you his IP address, 10.2.1.44.

Type the customer's IP address into the search bar:

sourcetype="access_combined_wcookie" 10.2.1.44

As you type into the search bar, Splunk's search assistant opens. Search assistant shows you typeahead, or contextual matches and completions, for each keyword as you type it into the search bar. These contextual matches are based on what's in your data. The entries under matching terms update as you continue to type, because the possible completions for your term change as well.

Part 7 - Time Ranges

Try different time ranges, like the previous week, within the search toolbar.

Identify the contents of search results and refine searches

Splunk supports the Boolean operators AND, OR, and NOT. When you include Boolean expressions in your search, the operators have to be capitalized. You can also mouse over results to refine searches.

Hands on Lab

1. Remember: click on Search on the toolbar and then click on the Data Summary button. Please choose the data source LoanStats3a.csv.
2. Search for the word: Status
3. Then click on the word Paid and add it to the search.
4. Click on the word RENT and exclude it from the search.

BONUS LAB:
1. Find the status of Not Paid and Not Mortgage.

Use the timeline

The timeline is a visual representation of the number of events returned by a search over a selected time range. The timeline is a type of histogram, where the range is broken up into smaller time intervals (such as seconds, minutes, hours, or days), and the count of events for each interval appears in column form. You can use the timeline to highlight patterns or clusters of events, or to investigate peaks (spikes in activity) and lows (possible server downtime) in event activity. Here, the timeline shows web access events over the previous business week.

Mouse over a bar to see the count of events. Click on a bar to drill down to that time range; drilling down in this way does not run a new search, it just filters the results from the previous search. When you use the timeline to display the results of real-time searches, the timeline represents the sliding time range window covered by the real-time search. It shows the count of events over the time range that the search was run.

Change the timeline format

The timeline is located in the Events tab above the events listing. Format options are located in the Format Timeline menu. You can hide the timeline (Hidden) or display a Compact or Full view of it. When Full is selected, the timeline is taller and displays the count on the y-axis and time on the x-axis. You can also toggle the timeline scale between linear (Linear Scale) and logarithmic (Log Scale).

Zoom in and zoom out to investigate events

Zoom and selection options are located above the timeline. At first, only the Zoom Out option is available. The timeline legend is in the top right corner of the timeline. This indicates the scale of the timeline; for example, 1 minute per column indicates that each column represents a count of events during that minute. Zooming in and out changes the time scale.
For example, if you click Zoom Out, the legend will indicate that each column now represents an hour instead of a minute. When you mouse over and select bars in the timeline, the Zoom to Selection and Deselect options become available.

Mouse over and click on the tallest bar, or drag your mouse over a cluster of bars in the timeline. The events list updates to display only the events that occurred in that selected time range, and the time range picker also updates to the selected time range. You can cancel this selection by clicking Deselect. When you Zoom to Selection, you filter the results of your previous search down to your selected time period; the timeline and events list update to show the results of the new search. You cannot Deselect after you have zoomed into a selected time range, but you can Zoom Out again.

Work with events

An event is a single piece of data in Splunk software, similar to a record in a log file or other data input. When data is indexed, it is divided into individual events. Each event is given a timestamp, host, source, and source type. Often, a single event corresponds to a single line in your inputs, but some inputs (for example, XML logs) have multiline events, and some inputs have multiple events on a single line. When you run a successful search, you get back events.

Hands on Lab

Back at the Flower & Gift shop, let's continue with the customer (10.2.1.44) you were assisting. He reported an error while purchasing a gift for his girlfriend. You confirmed his error, and now you want to find the cause of it. Continue with the last search, which showed you the customer's failed purchase attempts.

1. Type purchase into the search bar and run the search:

sourcetype="access_combined_wcookie" 10.2.1.44 purchase

When you search for keywords, your search is not case-sensitive, and Splunk retrieves the events that contain those keywords anywhere in the raw text of the event's data.

Use Boolean operators

If you're familiar with Apache server logs, in this case the access_combined format, you'll notice that most of these events have an HTTP status of 200, or Successful. These events are not interesting for you right now, because the customer is reporting a problem. Splunk supports the Boolean operators AND, OR, and NOT; when you include Boolean expressions in your search, the operators have to be capitalized.

2. Use the Boolean NOT operator to quickly remove all of these Successful page requests. Type in:

sourcetype="access_combined_wcookie" 10.2.1.44 purchase NOT 200

The AND operator is always implied between search terms, so this search is the same as:

sourcetype="access_combined_wcookie" AND 10.2.1.44 AND purchase NOT 200

You notice that the customer is getting HTTP server (503) and client (404) errors. But he specifically mentioned a server error, so let's quickly remove events that are irrelevant. Another way to add Boolean clauses quickly and interactively to your search is to use your search results: Splunk lets you highlight and select any segment from your search results and add it to, or exclude it from, your search.

3. Search for:

sourcetype="access_combined_wcookie" 10.2.1.44 purchase NOT 200 NOT 404

Timeline Usage

In the last topic, you really just focused on the search results listed in the events viewer area of this dashboard. Now, let's take a look at the timeline. Continue with the last search, which showed you the customer's failed purchase attempts. The location of each bar on the timeline corresponds to an instance when the events that match your search occurred. If there are no bars at a time period, no events were found then.
1. Mouse over one of the bars. A tooltip pops up and displays the number of events that Splunk found during the time span of that bar (1 bar = 1 hour). The taller the bar, the more events occurred at that time. Often, seeing spikes in the number of events, or no events at all, is a good indication that something has happened.

2. Click one of the bars, for example the tallest bar. Splunk does not run the search when you click on the bar. Instead, it gives you a preview of the results zoomed in at that time range, and updates your search results to show you only the events in that time span. You can still select other bars at this point.

3. Double-click on the same bar. Splunk runs the search again and retrieves only events during the one-hour span you selected. Each bar now represents one minute of time (1 bar = 1 min). One hour is still a wide time period to search, so let's narrow the search down more.

4. Double-click another bar. Once again, this updates your search to now retrieve events during that one-minute span of time. Each bar now represents the number of events for one second of time. Also, notice that the search overrides the time range picker, and it now shows "Custom time". (You'll see more of the time range picker later.)

5. Now you want to expand your search to see everything else, if anything, that happened during this second. Without changing the time range, replace your previous search in the search bar with:

*

Splunk supports using the asterisk (*) wildcard to search for "all" or to retrieve events based on parts of a keyword. Up to now, you've just searched for Web access logs. This search tells Splunk that you want to see everything that occurred in this time range.

Control search job progress

After you launch a search, you can access and manage information about the search's job without leaving the Search page. Once your search is running, paused, or finalized, click Job and choose from the available options there. You can:
- Edit the job settings. Select this to open the Job Settings dialog, where you can change the job read permissions, extend the job lifetime, and get a URL for the job that you can use to share the job with others or put a link to the job in your browser's bookmark bar.
- Send the job to the background. Select this if the search job is slow to complete and you would like to run the job in the background while you work on other Splunk activities (including running a new search job).
- Inspect the job. Opens a separate window and displays information and metrics for the search job using the Search Job Inspector.
- Delete the job. Use this to delete a job that is currently running, is paused, or has finalized. After you have deleted the job, you can still save the search as a report.

Change the search mode

The search mode controls the search experience. You can set it to speed up searches by cutting down on the event data it returns (Fast mode), or you can set it to return as much event information as possible (Verbose mode). In Smart mode (the default setting), it automatically toggles search behavior based on the type of search you're running.

Save the results

The Save As menu lists options for saving the results of a search as a Report, Dashboard Panel, Alert, or Event Type.
- Report: If you would like to make the search available for later use, you can save it as a report. You can run the report again on an ad hoc basis by finding the report on the Reports listing page and clicking its name.
- Dashboard Panel: Click this if you'd like to generate a dashboard panel based on your search and add it to a new or existing dashboard.
- Alert: Click to define an alert based on your search. Alerts run saved searches in the background (either on a schedule or in real time). When the search returns results that meet a condition you have set in the alert definition, the alert is triggered.
- Event Type: Event types let you classify events that have common characteristics. If the search doesn't include a pipe operator or a subsearch, you can use this to save it as an event type.
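Once saved, an event type becomes searchable through the eventtype field. For example, if you saved the failed-purchase search above under the name failed_purchase (a hypothetical name, not one defined in this course), you could build on it like this:

    eventtype=failed_purchase | stats count

The classification logic then lives in one place, and any search that builds on it stays short.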
Other search actions

Between the job progress controls and the search mode selector are three buttons which enable you to Share, Export, and Print the results of a search.
- Click Share to share the job. When you select this, the job's lifetime is extended to 7 days and read permissions are set to Everyone.
- Click Export to export the results. You can select to output to CSV, XML, raw events, or JSON, and specify the number of results to export.
- Click Print to send the results to a printer that has been configured.

Hands on Lab

1. Using your file LoanStats3a.csv, save your last search as an event type.
2. Go to Settings and click on Event types to view your saved event type.

End of Module Hands-on Quiz

Please refer to your virtual machine for the test.

Module 4 - Using Fields in Searches
- Understand fields
- Use fields in searches
- Use the fields sidebar
- Hands on Lab covering: Understand fields, Use fields in searches, Use the fields sidebar
- End of Module Hands-on Quiz

Understand fields

Fields exist in machine data in many forms. Often, a field is a value (with a fixed, delimited position on the line) or a name and value pair, where there is a single value for each field name. In Splunk Enterprise, fields are searchable name and value pairings that distinguish one event from another, because not all events will have the same fields and field values. Fields let you write more tailored searches to retrieve the specific events that you want. Some examples of fields are clientip for IP addresses accessing your Web server, _time for the timestamp of an event, and host for the domain name of a server.

A field can be multivalued; that is, it can appear more than once in an event and have a different value for each appearance. One of the more common examples of multivalue fields is email address fields: while the From field will contain only a single email address, the To and Cc fields may have one or more email addresses associated with them.

Use fields in searches

Use the following syntax to search for a field: fieldname="fieldvalue". Field names are case sensitive, but field values are not.

1. Go to the Search dashboard and type the following into the search bar:

sourcetype="access_*"

This indicates that you want to retrieve only events from your web access logs and nothing else. Apache web access logs are formatted as access_common, access_combined, or access_combined_wcookie.
Here, sourcetype is a field name and access_* is a wildcarded field value used to match any Apache web access event.

2. In the Events tab, scroll through the list of events. If you are familiar with the access_combined format of Apache logs, you recognize some of the information in each event, such as:
- IP addresses for the users accessing the website.
- GET or POST page request methods.
- URIs and URLs for the pages requested and referring pages.
- HTTP status codes for each page request.

Use the fields sidebar

To the left of the events list is the Fields sidebar. As Splunk Enterprise retrieves the events that match your search, the Fields sidebar updates with Selected fields and Interesting fields. These are the fields that Splunk Enterprise extracted from your data. Selected Fields are the fields that appear in your search results; the default fields host, source, and sourcetype are selected. You can hide and show the fields sidebar by clicking Hide Fields and Show Fields, respectively.

Click All Fields. The Select Fields dialog box opens, where you can edit the fields to show in the events list. You see the default fields that Splunk defined. Some of these fields are based on each event's timestamp (everything beginning with date_*), punctuation (punct), and location (index). Other field names apply to the web access logs: for example, there are clientip, method, and status. These are not default fields; they are extracted at search time.

Clicking a field name in the sidebar opens the field summary for that field. For example, clicking action opens the field summary for the action field; in this set of search results, Splunk Enterprise found five values for action, and the action field appears in 49.9% of your search results.

Hands on Lab

1. Go back to the Search dashboard and search for web access activity. Select Other > Yesterday from the time range picker:

sourcetype="access_*"

You were actually using fields all along! Each time you searched for sourcetype=access_*, the wildcarded value was used to match all field values beginning with access_ (which would include access_common, access_combined, and access_combined_wcookie). To search for a particular field, specify the field name and value: fieldname="fieldvalue". Note: Field names are case sensitive, but field values are not!

2. Scroll through the search results. If you are familiar with the access_combined format of Apache logs, you will recognize some of the information in each event, such as:
- IP addresses for the users accessing the website.
- Page request methods.
- URIs and URLs for the page request and referring page.
- HTTP status codes for each page request.

3. As Splunk retrieves these events, the Fields sidebar updates with Selected fields and Interesting fields. These are the fields that Splunk extracted from your data (most likely at search time). Notice that the default fields host, source, and sourcetype are selected fields and are displayed in your search results.

4. Click the Edit link in the fields sidebar. The Fields dialog opens and displays all the fields that Splunk extracted.
- Selected Fields are the fields you picked (from the available fields) to show in your search results (by default, host, source,
3. As Splunk retrieves these events, the Fields sidebar updates with selected fields and interesting fields. Notice that the default fields host, source, and sourcetype are selected fields and are displayed in your search results. You should recognize the field names that apply to the web access logs; for example, there are clientip, method, and status. These are not default fields; these are the fields that Splunk extracted from your data, and they have (most likely) been extracted at search time. You should also see other default fields that Splunk defined, some based on each event's timestamp (everything beginning with date_*), punctuation (punct), and location (index). Scroll through the interesting fields to see what else Splunk extracted.

4. Click the Edit link in the fields sidebar. The Fields dialog opens and displays all the fields that Splunk extracted.

• Selected Fields are the fields you picked (from the available fields) to show in your search results (by default, host, source, and sourcetype are selected).
• Available Fields are the fields that Splunk identified from the events in your current search (some of these fields were listed under interesting fields).

5. Scroll through the list of Available Fields. You're already familiar with the fields that Splunk extracted from the web access logs based on your search. But you should also notice other extracted fields that are related to the online store, such as action, category_id, and product_id. From conversations with your coworker, you may know that these fields are:

action: what a user does at the online shop
category_id: the type of product a user is viewing or buying
product_id: the catalog number of the product the user is viewing or buying

6. From the Available fields list, select action, category_id, and product_id.

7. Click Save. When you return to the Search view, the fields you selected will be included in your search results if they exist in that particular event. Different events will have different fields.

8. The fields sidebar doesn't just show you what fields Splunk has captured from your data. It also displays how many values exist for each of these fields. For the fields you just selected, in this set of search results, there are 2 for action, 5 for category_id, and 9 for product_id. This doesn't mean that these are all the values that exist for each of the fields; these are just the values that Splunk knows about from the results of your search. What are some of these values?

Under selected fields, click action. This opens the field summary for the action field. This window tells you that the action field appears in 71% of your search results, and that Splunk found two values for action: purchase and update. This means that roughly three-quarters of the web access events are related to the purchase of an item or an update (of the item quantity in the cart, perhaps).

Close this window and look at the other two fields you selected, category_id (what types of products the shop sells) and product_id (specific catalog names for products). Now you know a little bit more about the information in your data relating to the online Flower and Gift shop. The online shop sells a selection of flowers, gifts, plants, candy, and balloons. Let's use these fields to see what people are buying.

Use fields to run more targeted searches

These next two examples compare the results when searching with and without fields. When you run simple searches based on arbitrary keywords, Splunk matches the raw text of your data. When you add fields to your search, Splunk looks for events that have those specific field/value pairs.

Example 1

Return to the search you ran to check for errors in your data. Select Other > Yesterday from the time range picker:

error OR failed OR severe OR (sourcetype=access_* (404 OR 500 OR 503))
If you scroll through the (many) search results, you'll see that some of the events have action=update and a category_id with a value other than FLOWERS. These are not the events you wanted! Run this search again, but this time, use fields in your search. The HTTP error codes are values of the status field. Now your search looks like this:

error OR failed OR severe OR (sourcetype=access_* (status=404 OR status=500 OR status=503))

Notice the difference in the count of events between the two searches: because it's a more targeted search, the second search returns fewer events.

Example 2

Before you learned about the fields in your data, you might have run this search to see how many times flowers were purchased from the online shop:

sourcetype=access_* purchase flower*

As you typed in "flower", the search assistant shows you both "flower" and "flowers" in the typeahead. Since you don't know which is the one you want, you use the wildcard to match both.

Run this search instead. Select Other > Yesterday from the time range picker:

sourcetype=access_* action=purchase category_id=flower*

For the second search, even though you still used the wildcarded word "flower*", there is only one value of category_id that it matches (FLOWERS). Notice the difference in the number of events that Splunk retrieved for each search: the second search returns significantly fewer events. Searches with fields are more targeted and retrieve more exact matches against your data.

Now on your own:

1. Bring up the Loan data file.
2. Using fields, find entries where the annual salary is less than 20,000 and the borrower lives in the state of CA. (Use addr_state for the state.)
3. Refine the search for the field emp_title where it equals Walmart.

End of Module Quiz

Please refer to your virtual machine for the test.

Module 5 - Creating Reports and Visualizations

• Save a search as a report
• Edit reports
• Create reports that include visualizations such as charts and tables
• Hands on Lab covering: Save a search as a report, Edit reports, Create reports that include visualizations such as charts and tables
• Add reports to a dashboard
• Create an instant pivot from a search
• Hands on Lab covering: Add reports to a dashboard, Create an instant pivot from a search
• End of Module Hands on Quiz

Save a search as a report

To save your search as a report, click on the Report link. This opens the Save As Report dialog. From here, you need to do the following:

1. Enter a Title (or name) for your report.
2. Enter an optional Description to remind users what your report does.
3. Indicate if you'd like to include the Splunk Time Range Picker as a part of your report.

Once you click Save, Splunk prompts you to either review Additional Settings for your newly created report (Permissions, Schedule, Acceleration, and Embed), Add (the report) to Dashboard, View the report, or Continue Editing the search.

The additional settings that can be made to the report are as follows:

• Permissions: Allows you to set how the saved report is displayed: by owner, by app, or for all apps. In addition, you can make the report read only or writeable (can be edited).
• Schedule: Allows you to schedule the report (for Splunk to run/refresh it based upon your schedule). For example, an interval like every week, on Monday at 6 AM, and for a particular time range.
• Acceleration: Not all saved reports qualify for acceleration, and not all users (not even admins) have the ability to accelerate reports. Generally speaking, Splunk Enterprise will build a report acceleration summary for the report if it determines that the report would benefit from summarization (acceleration).
• Embed: Report embedding lets you bring the results of your reports to large numbers of report stakeholders. With report embedding, you can embed scheduled reports in external (non-Splunk) websites, dashboards, and portals. Embedded reports can display results in the form of event views, tables, charts, maps, single values, or any other visualization type. They use the same formatting as the originating report. When you embed a saved report, you do this by copying a Splunk-generated URL into an HTML-based web page.
Edit reports

You can easily edit an existing report. You can edit a report's definition (its search string, pivot setup, time range, or result formatting). You can also edit its description, permissions, schedule, and acceleration settings.

To edit a report's definition

If you want to edit a report's definition, there are two ways to start, depending on whether you're on the Reports listing page or looking at the report itself.

• If you're on the Reports listing page, locate the report you want to edit, go to the Actions column, and click Open in Search or Open in Pivot (you'll see one or the other depending on which tool you used to create the report).
• If you've entered the report to review its results, click Edit and select Open in Search or Open in Pivot (you'll see one or the other depending on which tool you used to create the report).

Edit the definition of a report opened in Search

After you open a report in Search, you can change the search string. After you rerun the report, a Save button will be enabled towards the upper right of the report. Click this to save the report. You also have the option of saving your edited search as a new report.

Create reports that include visualizations such as charts and tables

A visualization is a representation of data returned from a search. Most visualizations are graphical representations; however, a visualization can also be non-graphical. In dashboards, a panel contains one or more visualizations.

Visualizations available for simple XML dashboards include:

• chart
• event listing
• map
• table
• single value

A chart visualization has several types:

• area
• bar
• bubble
• column
• filler gauge
• line
• marker gauge
• pie
• radial gauge
• scatter

Hands on Lab

1. Click Search on the toolbar, then click the Data Summary button.
2. Choose the Sourcetype tab, and click on access_combined_wcookie.
3. Under Interesting Fields, select category_id. Then, under Reports, click Top values.
4. It should yield a report.
5. Now click on Statistics, and notice the table of values.
6. Go back to the Visualization tab. Under the Bar Chart dropdown, investigate all the different chart types as well.
7. Under Format, investigate all the different options.

Bonus Lab: Using the LoanStats3a.csv file, create a report from the data that shows the top values across all the states.

Add reports to a dashboard

Once you have created your reports, you can easily add them to a dashboard by clicking the Add to Dashboard button.

Create an instant pivot from a search

From any search, simply select the Statistics tab and click on the Pivot icon. Let's take a walkthrough:

1. Make sure to pick the interesting fields to be selected fields.
2. Click the Statistics tab after you have the search you want.
3. Then click the Pivot icon.
4. Then you can choose the fields you have selected to Pivot, and click OK.
5. Then you can choose a field like annual_inc with a default of Sum to be part of your Pivot column values.
6. And then pick a field like addr_state for the row column.
7. Finally, pick a bar chart on the left side.

Hands on Lab

1. Create a report out of the LoanStats3a.csv source that looks into annual income < 70000 and the addr_state of CA, FL, NY. (A hedged starting search is sketched below.)
2. Create an instant pivot out of the search from #1 above.
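One possible starting point for step 1 of the lab above (hedged: the source name assumes the file was uploaded as LoanStats3a.csv, following the earlier labs; adjust the source to match your environment):

source="LoanStats3a.csv" annual_inc<70000 (addr_state=CA OR addr_state=FL OR addr_state=NY) | top limit=10 addr_state

From the Statistics tab you can then use Save As > Report for step 1, and click the Pivot icon for step 2.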
End of Module Hands on Quiz

Please refer to your virtual machine for the test.

Module 6 - Working with Dashboards

• Create a dashboard
• Add a report to a dashboard
• Hands on Lab covering: Create a dashboard, Add a report to a dashboard
• Add a pivot report to a dashboard
• Edit a dashboard
• Hands on Lab covering: Add a pivot report to a dashboard, Edit a dashboard
• End of Module Hands on Quiz

Create a dashboard

You can create a dashboard from the search, OR you can click on the Dashboard option on the toolbar.

Add a report to a dashboard

Click on Add to Dashboard from your report.

Hands on Lab: Let's use the flower shop transactions to create a dashboard and add a report to it

Before you learned about the fields in your data, you might have run this search to see how many times flowers were purchased from the online shop:

sourcetype=access_* purchase flower* | top limit=20 category_id

1. Let's save the report of this search as Flowers Category.
2. Click on the View button to view the report.
3. Click Add to Dashboard to add the report to a dashboard.
4. Name the dashboard Flowers Dashboard.

Bonus Lab: Take the report out of the LoanStats3a.csv source that looks into annual income < 70000 and the addr_state of CA, FL, NY from the last module, and create a dashboard.

Add a pivot report to a dashboard

From your pivot, you can save as a dashboard panel.

Edit a dashboard

From your dashboard, you can edit your dashboard from the menu. And then you could, for example, edit Panels.

Hands on Lab:

1. Create an instant pivot, like the one from the previous module, out of the LoanStats3a.csv source that looks into annual income < 70000 and the addr_state of CA, FL, NY.
2. Then add that pivot report to the dashboard.
3. Create another report that looks at ALL the annual incomes in the states of CA, FL, NY.
4. Add that report to the dashboard created in exercise #1.
5. Edit the dashboard panels and add titles to your panels.

Bonus Lab:

1. Create another instant pivot or report and add it to the existing dashboard.

End of Module Hands on Quiz

Please refer to your virtual machine for the test.

Module 7 - Search Fundamentals

• Review basic search commands and general search practices
• Examine the anatomy of a search
• Use the following commands to perform searches:
  • Fields
  • Table
  • Rename
  • Rex
  • Multikv

Review basic search commands and general search practices

To successfully use Splunk, it is vital that you write effective searches. Using the index efficiently will make your initial discoveries faster, and the reports you create will run faster for you and for others. In this chapter, we will cover the following topics:

• How to write effective searches
• How to search using fields
• Understanding time
• Saving and sharing searches

Using search terms effectively

The key to creating an effective search is to take advantage of the index. The Splunk index is effectively a huge word index, sliced by time. The single most important factor for the performance of your searches is how many events are pulled from the disk. Using the index as it is designed is the best way to build fast searches. Regular expressions can then be used to further filter results or extract fields.

The following few key points should be committed to memory:

• Search terms are case insensitive: Searches for error, Error, ERROR, and ErRoR are all the same thing.
• Search terms are additive: Given the search item mary error, only events that contain both words will be found. There are Boolean and grouping operators to change this behavior, which we will discuss in this chapter under Boolean and grouping operators.
• Only the time frame specified is queried: This may seem obvious, but it's very different from a database, which would always have a single index across all events in a table. Since each index is sliced into new buckets over time, only the buckets that contain events for the time frame in question need to be queried.
• Search terms are words, including parts of words: A search for foo will also match foobar.
With just these concepts, you can write fairly effective searches. Let's dig a little deeper, though:

• A word is anything surrounded by whitespace or punctuation: For instance, given the log line 2012-02-07T01:03:31.104-0600 INFO AuthClass Hello world. [user=Bobby, ip=1.2.3.3], the "words" indexed are 2012, 02, 07T01, 03, 31, 104, 0600, INFO, AuthClass, Hello, world, user, Bobby, ip, 1, 2, 3, and 3. This may seem strange, and possibly a bit wasteful, but this is what Splunk's index is really, really good at: dealing with huge numbers of words across a huge number of events.
• Splunk is not grep with an interface: One of the most common questions is whether Splunk uses regular expressions for your searches. Technically, the answer is no. Splunk does use regex internally to extract fields, including the auto-generated fields, but most of what you would do with regular expressions is available in other ways.
• Field names are case sensitive: When searching for host=myhost, host must be lowercase. Likewise, any extracted or configured fields have case sensitive field names, but the values are case insensitive.
  • Host=myhost will not work
  • host=myhost will work
  • host=MyHost will work
• Fields do not have to be defined before indexing data: An indexed field is a field that is added to the metadata of an event at index time. There are legitimate reasons to define indexed fields, but in the vast majority of cases it is unnecessary and is actually wasteful.
• Numbers are not numbers until after they have been parsed at search time: This means that searching for foo>5 will not use the index, as the value of foo is not known until it has been parsed out of the event at search time. There are different ways to deal with this behavior, depending on the question you're trying to answer.

Examine the anatomy of a search

Boolean and grouping operators

There are a few operators that you can use to refine your searches (note that these operators must be in uppercase to not be considered search terms):

• AND is implied between terms. For instance, error mary (two words separated by a space) is the same as error AND mary.
• OR allows you to specify multiple values. For example, error OR mary means find any event that contains either word.
• NOT applies to the next term or group. For instance, error NOT mary would find events that contain error but do not contain mary.
• The quote marks ("") identify a phrase. For example, "Out of this world" will find this exact sequence of words. Out of this world (without quotes) would find any event that contains all of these words, but not necessarily in that order.
• Parentheses ( ( ) ) are used for grouping terms. Parentheses can help avoid confusion in logic.
• The equal sign (=) is reserved for specifying fields. Searching for an equal sign can be accomplished by wrapping it in quotes. You can also escape characters to search for them: \= is the same as "=".
• Brackets ( [ ] ) are used to perform a subsearch.
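The last operator in that list, brackets, is easiest to understand with a short sketch. The following hedged example (field names borrowed from the web access data used earlier in this course) uses a subsearch to retrieve all web access events from client IPs that at some point triggered a 503 error:

sourcetype=access_* [ search sourcetype=access_* status=503 | fields clientip ]

The inner search runs first, and its results are substituted into the outer search as field/value conditions (effectively clientip=a OR clientip=b, and so on).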
You can use these operators in fairly complicated ways if you want to be very specific, or even to find multiple sets of events in a single query. The following are a few examples:

• error mary NOT jacky
• error NOT (mary warn) NOT (jacky error)
• index=myapplicationindex ( sourcetype=sourcetype1 AND ( (bob NOT error) OR (mary AND warn) ) ) OR ( sourcetype=sourcetype2 (jacky info) )

This can also be written with some whitespace for clarity:

index=myapplicationindex
( sourcetype=security
  AND
  ( (bob NOT error) OR (mary AND warn) )
)
OR
( sourcetype=application (jacky info) )

Clicking to modify your search

Though you can probably figure it out by just clicking around, it is worth discussing the behavior of the GUI when moving your mouse around and clicking.

• Clicking on any word or field value will give you the option to Add to search or Exclude from search (the existing search) or (create a) New search.
• Clicking on a word or a field value that is already in the query will give you the option to remove it (from the existing query) or, as above, to start a (create a) new search.

Event segmentation

In previous versions of Splunk, event segmentation was configurable through a setting in the Options dialog. In version 6.2, the options dialog is not present; although segmentation (discussed later in this chapter under the field widgets section) is still an important concept, it is not accessible through the web interface/options dialog in this version.

Field widgets

Clicking on values in the Select Fields dialog (the field picker), or in the field value widgets underneath an event, will again give us an option to append (add to) or exclude (remove from) our search or, as before, to start a new search. For instance, if source="C:\Test Data\TM1ProcessError_20140623213757_temp.log" appears under your event, clicking on that value and selecting Add to search will append source="C:\\Test Data\\TM1ProcessError_20140623213757_temp.log" to your search.

To use the field picker, you can click on the link All Fields. Expand the results window by clicking on > in the far-left column. Clicking on a result will append that item to the current search.

If a field value looks like key=value in the text of an event, you will want to use one of the field widgets instead of clicking on the raw text of the event. Depending on your event segmentation setting, clicking on the word will either add the value or key=value. The former will not take advantage of the field definition; instead, it will simply search for the word. The latter will work for events that contain the exact quoted text, but not for other events that actually contain the same field value extracted in a different way.

Time

Clicking on the time next to an event will open the _time dialog, allowing you to change the search to select Events Before or After a particular time period, with the following choices:

• Before this time
• After this time
• At this time

In addition, you can select Nearby Events within plus, minus, or plus or minus, a number of seconds (the default), milliseconds, minutes, hours, days, or weeks.

One search trick is to click on the time of an event, select At this time, and then use the Zoom out link (above the timeline) until the appropriate time frame is reached.
Fields command

Description

Keeps (+) or removes (-) fields from search results based on the field list criteria. If + is specified, only the fields that match one of the fields in the list are kept. If - is specified, only the fields that match one of the fields in the list are removed. If neither is specified, defaults to +.

Syntax

fields [+|-] <wc-field-list>

Required arguments

<wc-field-list>
Syntax: <string>, <string>, ...
Description: Comma-delimited list of fields to keep (+) or remove (-). You can use wild card characters in the field names.

Usage

By default, the internal fields _raw and _time are included in output. The fields command does not remove internal fields unless explicitly specified with:

... | fields - _*

or more explicitly, with:

... | fields - _raw, _time

Note: Be cautious removing the _time field. Statistical commands, such as timechart and chart, cannot display date or time information without the _time field.

Important: The leading underscore is reserved for all internal Splunk Enterprise field names.

Examples

Example 1: Remove the "host" and "ip" fields.

... | fields - host, ip

Example 2: Keep only the host and ip fields, and remove all of the internal fields (the internal fields begin with an underscore character, for example _time).

... | fields host, ip | fields - _*

Example 3: Keep only the fields 'source', 'sourcetype', 'host', and all fields beginning with 'error'.

... | fields source, sourcetype, host, error*

Table command

Description

The table command is similar to the fields command in that it lets you specify the fields you want to keep in your results. Use the table command when you want to retain data in tabular format.

The table command returns a table formed by only the fields specified in the arguments. Columns are displayed in the same order that fields are specified. Column headers are the field names; rows are the field values. Each row represents an event.

The table command can be used to build a scatter plot to show trends in the relationships between discrete values of your data. Otherwise, you should not use it for charts (such as chart or timechart) because the UI requires the internal fields (which are the fields beginning with an underscore, _*) to render the charts, and the table command strips these fields out of the results by default. Instead, you should use the fields command, because it always retains all the internal fields.

Command type: The table command is a non-streaming command. If you are looking for a streaming command similar to the table command, use the fields command.

Field renaming: The table command doesn't let you rename fields, only specify the fields that you want to show in your tabulated results. If you're going to rename a field, do it before piping the results to table.

Syntax

table <wc-field-list>

Arguments

<wc-field-list>
Syntax: <wc-field> <wc-field> ...
Description: A list of field names. You can use wild card characters in the field names.
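The table command has no worked example in this section, so here is a hedged sketch against the web access data used earlier in this course:

sourcetype=access_* | table clientip method status

This returns a three-column table with one row per event, with the columns in the order specified.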
Rename command

Description

Use the rename command to rename a specified field or multiple fields. This command is useful for giving fields more meaningful names, such as "Product ID" instead of "pid". If you want to rename multiple fields, you can use wildcards.

Note: You cannot use this command to merge multiple fields into one field, because null, or non-present, fields are brought along with the values. For example, if you had events with either product_id or pid fields, ... | rename pid AS product_id would not merge the pid values into the product_id field. It overwrites product_id with Null values where pid does not exist for the event. See Example 2, below.

Note: You cannot rename one field with multiple names. For example, if you had a field A, you cannot do "A as B, A as C" in one string. This also applies to other commands that rename fields; the following, for instance, is not valid:

... | stats first(host) AS site, first(host) AS report

Syntax

rename <wc-field> AS <wc-field>...

Required arguments

wc-field
Syntax: <string>
Description: The name of a field and the name to replace it. You can use wild card characters in the field names. Names with spaces must be enclosed in quotation marks.

Examples

Example 1: Rename a single field:

... | rename SESSIONID AS sessionID

Example 2: Use wildcards to rename multiple fields:

... | rename *ip AS *IPaddress

If both the source and destination fields are wildcard expressions with the same number of wildcards, the renaming will carry over the wildcarded portions to the destination expression.
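Because rename accepts quoted names, renaming a field to a phrase is a one-liner. A hedged sketch, reusing the pid field mentioned in the description above:

... | rename pid AS "Product ID"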
Rex command

Description

Use this command to either extract fields using regular expression named groups, or replace or substitute characters in a field using sed expressions. Use the rex command for search-time field extraction or string replacement and character substitution.

The rex command matches the value of the specified field against the unanchored regular expression and extracts the named groups into fields of the corresponding names. If a field is not specified, the regular expression is applied to the _raw field. Note: Running rex against the _raw field might have a performance impact.

When mode=sed, the given sed expression used to replace or substitute characters is applied to the value of the chosen field. If a field is not specified, the sed expression is applied to _raw. This sed-syntax is also used to mask sensitive data at index-time.

Syntax

rex [field=<field>] ( <regex-expression> [max_match=<int>] [offset_field=<string>] ) | (mode=sed <sed-expression>)

Required arguments

regex-expression
Syntax: "<string>"
Description: The PCRE regular expression that defines the information to match and extract from the specified field. Quotation marks are required.

mode
Syntax: mode=sed
Description: Specify to indicate that you are using a sed (UNIX stream editor) expression.

sed-expression
Syntax: "<string>"
Description: When mode=sed, specify whether to replace strings (s) or substitute characters (y) in the matching regular expression. No other sed commands are implemented. Quotation marks are required. Sed mode supports the following flags: global (g) and Nth occurrence (N), where N is a number that is the character location in the string.

Optional arguments

field
Syntax: field=<field>
Description: The field that you want to extract information from.
Default: _raw

max_match
Syntax: max_match=<int>
Description: Controls the number of times the regex is matched. If greater than 1, the resulting fields are multivalued fields. Use 0 to mean unlimited.
Default: 1

offset_field
Syntax: offset_field=<string>
Description: If provided, a field is created with the name specified by <string>. This value of the field has the endpoints of the match in terms of zero-offset characters into the matched field. For example, if the rex expression is "(?<tenchars>.{10})", this matches the first ten characters of the field, and the offset_field contents is "0-9".
Default: unset

Sed expression

When using the rex command in sed mode, you have two options: replace (s) or character substitution (y).

The syntax for using sed to replace (s) text in your data is: "s/<regex>/<replacement>/<flags>"

• <regex> is a PCRE regular expression, which can include capturing groups.
• <replacement> is a string to replace the regex match. Use \n for backreferences, where "n" is a single digit.
• <flags> can be either: g to replace all matches, or a number to replace a specified match.

The syntax for using sed to substitute characters is: "y/<string1>/<string2>/"

• This substitutes the characters that match <string1> with the characters in <string2>.
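The substitute form gets no official example in this section, so here is a hedged sketch (it reuses the ccnumber field from the replace example further below); each digit in the field is transliterated to the character X:

... | rex field=ccnumber mode=sed "y/0123456789/XXXXXXXXXX/"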
Examples

Example 1: Extract "from" and "to" fields using regular expressions. If a raw event contains "From: Susan To: Bob", then from=Susan and to=Bob.

... | rex field=_raw "From: (?<from>.*) To: (?<to>.*)"

Example 2: Extract "user", "app" and "SavedSearchName" from a field called "savedsearch_id" in scheduler.log events. If savedsearch_id=bob;search;my_saved_search, then user=bob, app=search, and SavedSearchName=my_saved_search.

... | rex field=savedsearch_id "(?<user>\w+);(?<app>\w+);(?<SavedSearchName>\w+)"

Example 3: Use sed syntax to match the regex to a series of numbers and replace them with an anonymized string.

... | rex field=ccnumber mode=sed "s/(\d{4}-){3}/XXXX-XXXX-XXXX-/g"

Example 4: Display IP addresses and ports of potential attackers.

sourcetype=linux_secure port "failed password" | rex "\s+(?<ports>port \d+)" | top src_ip ports showperc=0

This search uses rex to extract the port field and values. Then, it displays a table of the top source IP addresses (src_ip) and ports returned by the search for potential attackers.

Usage

Splunk Enterprise uses perl-compatible regular expressions (PCRE). When you use regular expressions in searches, you need to be aware of how characters such as pipe ( | ) and backslash ( \ ) are handled.

Multikv command

Description

Extracts field-values from table-formatted events, such as the results of top, netstat, ps, and so on. The multikv command creates a new event for each table row and assigns field names from the title row of the table.

An example of the type of data multikv is designed to handle:

Name      Age   Occupation
Josh      42    SoftwareEngineer
Francine  35    CEO
Samantha  22    ProjectManager

The key properties here are:

• Each line of text represents a conceptual record.
• The columns are aligned.
• The first line of text provides the names for the data in the columns.

multikv can transform this table from one event into three events with the relevant fields. It works more easily with fixed-alignment data, though it can sometimes handle merely ordered fields.

The general strategy is to identify a header, offsets, and field counts, and then determine which components of subsequent lines should be included into those field names. Multiple tables in a single event can be handled (if multitable=true), but this may require ensuring that the secondary tables have Capitalized or ALLCAPS names in a header row. Auto-detection of header rows favors rows that are text, and are ALLCAPS or Capitalized.

Syntax

multikv [conf=<stanza_name>] [<multikv-option>...]

Optional arguments

conf
Syntax: conf=<stanza_name>
Description: If you have a field extraction defined in multikv.conf, use this argument to reference the stanza in your search.

<multikv-option>
Syntax: copyattrs=<bool> | fields <field-list> | filter <field-list> | forceheader=<int> | multitable=<bool> | noheader=<bool> | rmorig=<bool>
Description: Options for extracting fields from tabular events.

Descriptions for multikv options

copyattrs
Syntax: copyattrs=<bool>
Description: When true, multikv copies all fields from the original event to the events generated from that event. When false, no fields are copied from the original event. This means that the events will have no _time field and the UI will not know how to display them.
Default: true

fields
Syntax: fields <field-list>
Description: Limit the fields set by the multikv extraction to this list. Ignores any fields in the table which are not on this list.

filter
Syntax: filter <term-list>
Description: If specified, multikv skips over table rows that do not contain at least one of the strings in the filter list. Quoted expressions are permitted, such as "multiple words" or "trailing_space ".

forceheader
Syntax: forceheader=<int>
Description: Forces the use of the given line number (1 based) as the table's header. Does not include empty lines in the count.
Default: The multikv command attempts to determine the header line automatically.

multitable
Syntax: multitable=<bool>
Description: Controls whether or not there can be multiple tables in a single _raw in the original events.
Default: true

noheader
Syntax: noheader=<bool>
Description: Handle a table without header row identification. The size of the table will be inferred from the first row, and fields will be named Column_1, Column_2, and so on. noheader=true implies multitable=false.
Default: false

rmorig
Syntax: rmorig=<bool>
Description: When true, the original events will not be included in the output results. When false, the original events are retained in the output results, with each original emitted after the batch of generated results from that original.
Default: true

Examples

Example 1: Extract the "COMMAND" field when it occurs in rows that contain "splunkd".

... | multikv fields COMMAND filter splunkd

Example 2: Extract the "pid" and "command" fields.

... | multikv fields pid command

Hands-on Lab

1. Use the source LoanStats3a.csv and only take a look at some fields out of the data.
2. Use the source LoanStats3a.csv and the table command on the same fields as in #1.
3. Use the source LoanStats3a.csv and use the rename command to rename fields in #1.
4. Use the source LoanStats3a.csv and use the rex command for:
   a. source="LoanStats3a.csv" annual_inc=60000 | rex "Does not meet the credit policy.(?<all_util>.*)"
   b. Then click on the all_util field to demonstrate the rex results.
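A hedged hint for items #2 and #3 of the lab above (field names follow the earlier Loan file labs; substitute whichever fields you chose in #1):

source="LoanStats3a.csv" | rename emp_title AS employer | table annual_inc addr_state employer

Note that the rename is done before table, as advised in the table command section.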
End of Module Quiz

Please refer to your virtual machine for the test.

Module 8 - Reporting Commands, Part 1

• Use the following commands and their functions:
  • Top
  • Rare
• Hands on Lab covering: Top, Rare
  • Stats
  • Addcoltotals
• Hands on Lab covering: Stats, Addcoltotals
• End of Module Hands on Quiz

Top command

Description

Displays the most common values of a field, along with a count and percentage. Finds the most frequent tuple of values of all fields in the field list. If the optional by-clause is included, the command finds the most frequent values for each distinct tuple of values of the group-by fields.

Syntax

top [<N>] [<top-options>...] <field-list> [<by-clause>]

Required arguments

<field-list>
Syntax: <field>, <field>, ...
Description: Comma-delimited list of field names.

Optional arguments

<N>
Syntax: <int>
Description: The number of results to return.

<top-options>
Syntax: countfield=<string> | limit=<int> | otherstr=<string> | percentfield=<string> | showcount=<bool> | showperc=<bool> | useother=<bool>
Description: Options for the top command. See Top options.

<by-clause>
Syntax: BY <field-list>
Description: The name of one or more fields to group by.

Top options

countfield
Syntax: countfield=<string>
Description: The name of a new field that the value of count is written to.
Default: "count"

limit
Syntax: limit=<int>
Description: Specifies how many tuples to return. "0" returns all values.
Default: "10"

otherstr
Syntax: otherstr=<string>
Description: If useother is true, specify the value that is written into the row representing all other values.
Default: "OTHER"

percentfield
Syntax: percentfield=<string>
Description: Name of a new field to write the value of percentage.
Default: "percent"

showcount
Syntax: showcount=<bool>
Description: Specify whether to create a field called "count" (see "countfield" option) with the count of that tuple.
Default: true

showperc
Syntax: showperc=<bool>
Description: Specify whether to create a field called "percent" (see "percentfield" option) with the relative prevalence of that tuple.
Default: true

useother
Syntax: useother=<bool>
Description: Specify whether or not to add a row that represents all values not included due to the limit cutoff.
Default: false

Examples

Example 1: Return the 20 most common values of the "referer" field.

sourcetype=access_* | top limit=20 referer

Example 2: Return top "action" values for each "referer_domain".

sourcetype=access_* | top action by referer_domain

Because a limit is not specified, this returns all the combinations of values for "action" and "referer_domain", as well as the counts and percentages.

Example 3: Return the top product purchased for each category. Do not show the percent field. Rename the count field to "total".

sourcetype=access_* status=200 action=purchase | top 1 productName by categoryId showperc=f countfield=total
Rare command

Description

Displays the least common values of a field. Finds the least frequent tuple of values of all fields in the field list. If the <by-clause> is specified, this command returns rare tuples of values for each distinct tuple of values of the group-by fields. This command operates identically to the top command, except that the rare command finds the least frequent values instead of the most frequent.

Syntax

rare [<top-options>...] <field-list> [<by-clause>]

Required arguments

<field-list>
Syntax: <string>, ...
Description: Comma-delimited list of field names.

Optional arguments

<top-options>
Syntax: countfield=<string> | limit=<int> | percentfield=<string> | showcount=<bool> | showperc=<bool>
Description: Options that specify the type and number of values to display. These are the same <top-options> used by the top command.

<by-clause>
Syntax: BY <field-list>
Description: The name of one or more fields to group by.

Top options

countfield
Syntax: countfield=<string>
Description: The name of a new field to write the value of count into.
Default: "count"

limit
Syntax: limit=<int>
Description: Specifies how many tuples to return. If you specify limit=0, all values up to maxresultrows are returned. Specifying a value larger than maxresultrows produces an error. See the Limits section.
Default: 10

percentfield
Syntax: percentfield=<string>
Description: Name of a new field to write the value of percentage.
Default: "percent"

showcount
Syntax: showcount=<bool>
Description: Specify whether to create a field called "count" (see "countfield" option) with the count of that tuple.
Default: true

showperc
Syntax: showperc=<bool>
Description: Specify whether to create a field called "percent" (see "percentfield" option) with the relative prevalence of that tuple.
Default: true

Limits

There is a limit on the number of results which rare returns. By default this limit is 10, but other values can be selected with the limit option, up to a further constraint expressed in limits.conf, in the [rare] stanza, as maxresultrows. This ceiling is 50,000 by default, and effectively keeps a ceiling on the memory that rare will use.

Examples

Example 1: Return the least common values of the "url" field.

... | rare url

Example 2: Find the least common "user" value for a "host".

... | rare user by host

Hands on Lab covering: Top, Rare

1. Run: source="C:\\LoanStats3a.csv" | top limit=20 addr_state
2. Now, show the rare addr_state.
3. Run another search on your own demonstrating your use of the top and rare functions.
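For step 3, one possible hedged sketch, using a field from the earlier labs (emp_title):

source="C:\\LoanStats3a.csv" | rare limit=5 emp_title

This shows the five least common employer titles in the loan data.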
Stats command

Description

Calculates aggregate statistics over the results set, such as average, count, and sum. This is similar to SQL aggregation. If stats is used without a by clause, only one row is returned, which is the aggregation over the entire incoming result set. If you use a by clause, one row is returned for each distinct value specified in the by clause.

Syntax

Simple: stats (stats-function(field) [AS field])... [BY field-list]

Complete: stats [partitions=<num>] [allnum=<bool>] [delim=<string>] ( <stats-agg-term>... | <sparkline-agg-term>... ) [<by-clause>]

Required arguments

stats-agg-term
Syntax: <stats-function>(<evaled-field> | <wc-field>) [AS <wc-field>]
Description: A statistical aggregation function. The function can be applied to an eval expression, or to a field or set of fields. Use the AS clause to place the result into a new field with a name that you specify. You can use wild card characters in field names. For more information on eval expressions, see Types of eval expressions in the Search Manual.

sparkline-agg-term
Syntax: <sparkline-agg> [AS <wc-field>]
Description: A sparkline aggregation function. Use the AS clause to place the result into a new field with a name that you specify. You can use wild card characters in the field name.

Optional arguments

partitions
Syntax: partitions=<num>
Description: If specified, partitions the input data based on the split-by fields for multithreaded reduce.
Default: 1

allnum
Syntax: allnum=<bool>
Description: If true, computes numerical statistics on each field if and only if all of the values of that field are numerical.
Default: false

delim
Syntax: delim=<string>
Description: Specifies how the values in the list() or values() aggregation are delimited.
Default: a single space

by-clause
Syntax: BY <field-list>
Description: The name of one or more fields to group by. You cannot use a wildcard character to specify multiple fields with similar names. You must specify each field separately.

Stats function options

stats-function
Syntax: avg() | c() | count() | dc() | distinct_count() | earliest() | estdc() | estdc_error() | exactperc<int>() | first() | last() | latest() | list() | max() | median() | min() | mode() | p<int>() | perc<int>() | range() | stdev() | stdevp() | sum() | sumsq() | upperperc<int>() | values() | var() | varp()
Description: Functions used with the stats command. Each time you invoke the stats command, you can use more than one function. However, you can only use one by clause.

Usage

The stats command does not support wildcard characters in field values in BY clauses. For example, you cannot specify | stats count BY source*.

Basic Examples

1. Return the average transfer rate for each host:

sourcetype=access* | stats avg(kbps) by host

2. Search the access logs, and return the total number of hits from the top 100 values of "referer_domain". The "top" command returns a count and percent value for each "referer_domain".

sourcetype=access_combined | top limit=100 referer_domain | stats sum(count) AS total

3. Calculate the average time for each hour for similar fields using wildcard characters. Return the average, for each hour, of any unique field that ends with the string "lay" (for example, delay, xdelay, relay).

... | stats avg(*lay) BY date_hour

4. Remove duplicates of results with the same "host" value, and return the total count of the remaining results:

... | stats dc(host)

Addcoltotals command

Description

The addcoltotals command appends a new result to the end of the search result set. The result contains the sum of each numeric field, or you can specify which fields to summarize; the addcoltotals command then calculates the sum only for the fields in the list you specify. Results are displayed on the Statistics tab.

Syntax

addcoltotals [labelfield=<field>] [label=<string>] [<fieldlist>]

Optional arguments

<fieldlist>
Syntax: <field> ...
Description: A space delimited list of valid field names. You can use the asterisk ( * ) as a wildcard in the field names.
Default: Calculates the sum for all of the fields.

labelfield
Syntax: labelfield=<fieldname>
Description: Specify a field name to add to the result set. If the labelfield argument is specified, a column is added to the statistical results table with the name specified.
Default: none

label
Syntax: label=<string>
Description: Used with the labelfield argument to add a label in the summary event. If the labelfield argument is absent, the label argument has no effect.
Default: Total

Examples

Example 1: Compute the sums of all the fields, and put the sums in a summary event called "change_name".

... | addcoltotals labelfield=change_name label=ALL

Example 2: Add a column total for two specific fields in a table.

sourcetype=access_* | table userId bytes avgTime duration | addcoltotals bytes duration

Example 3: Filter fields for two name-patterns, and get totals for one of them.

... | fields user*, *size | addcoltotals *size

Example 4: Augment a chart with a total of the values present.

index=_internal source=*metrics.log group=pipeline | stats avg(cpu_seconds) by processor | addcoltotals labelfield=processor

Hands on Lab

1. Run a search query that uses the top and stats functions with the Loan file to get the count:

source="C:\\LoanStats3a.csv" | top limit=20 addr_state | stats count

2. Run: source="C:\\LoanStats3a.csv" addr_state=CA | stats count

3. Try running:

sourcetype=access_* | table userId bytes avgTime duration | addcoltotals bytes duration

4. Come up with your own example for the Loans file.
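One possible hedged answer to step 4 (field names follow the earlier Loan file labs):

source="C:\\LoanStats3a.csv" | stats count avg(annual_inc) AS avg_income by addr_state | addcoltotals labelfield=addr_state label=ALL count

This counts loans and averages annual income per state, then appends a summary row (labeled ALL) totaling only the count column.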
End of Module Hands on Quiz

Please refer to your virtual machine for test details.

Module 9 - Reporting Commands, Part 2

• Explore the available visualizations
• Create a basic chart
• Split values into multiple series
• Hands on Lab covering: Explore the available visualizations, Create a basic chart, Split values into multiple series
• Omit null and other values from charts
• Create a time chart
• Chart multiple values on the same timeline
• Hands on Lab covering: Omit null and other values from charts, Create a time chart, Chart multiple values on the same timeline
• Format charts
• Explain when to use each type of reporting command
• Hands on Lab covering: Format charts, Explain when to use each type of reporting command
• End of Module Hands on Quiz

Explore the available visualizations

Accessing Splunk's visualization definition features

Splunk provides user interface tools to create and modify visualizations. You can access these tools from various places in Splunk Web:

• Search
• Dashboards
• Dashboard visual editor
• Pivot
• Reports

Visualizations from Splunk Search

You can modify how Splunk displays search results in the Search page. After running a search, select the Visualization tab, then select the type of visualization to display and specify formatting options for the selected visualization. The search must be a reporting search that returns results that can be formatted as a visualization.

Dashboard panel visualizations

When you base a new dashboard panel on search results, you can choose the visualization that best represents the data returned by the search. To create a dashboard panel from search results, after you run the search click Save As > Dashboard Panel. You can then use the Visualization Editor to fine-tune the way the panel visualization displays.

Events visualizations

Events visualizations are essentially raw lists of events. You can get events visualizations from any search that does not include a transform operation.

Tables

You can pick table visualizations from just about any search, but the most interesting tables are generated by searches that include transform operations, such as a search that uses reporting commands like stats, chart, timechart, top, or rare.

Charts

Splunk provides a variety of chart visualizations, such as column, line, area, scatter, and pie charts. These visualizations require transforming searches (searches that use reporting commands) whose results involve one or more series.

Maps

Splunk provides a map visualization that lets you plot geographic coordinates as interactive markers on a world map. Searches for map visualizations should use the geostats search command to plot markers on a map. The geostats command is similar to the stats command, but provides options for zoom levels and cells for mapping. Events generated from the geostats command include latitude and longitude coordinates for markers.
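The geostats command is not exercised in this course's labs, but a hedged sketch shows the usual pattern (this assumes the iplocation command, which is not covered in this course, is available to derive latitude/longitude coordinates from the clientip field):

sourcetype=access_* | iplocation clientip | geostats count

Each marker on the resulting map reflects the event count in that geographic cell.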
Create a basic chart

Charts

Splunk provides a variety of chart visualizations, such as column, line, area, scatter, and pie charts. These visualizations require transforming searches (searches that use reporting commands) whose results involve one or more series.

A series is a sequence of related data points that can be plotted on a chart. For example, each line plotted on a line chart represents an individual series. It may help to think of the tables that can be generated by transforming searches: a "single series" search would produce a table with only two columns, while a "multiple series" search would produce a table with three or more columns. Every column in the table after the first one represents a different series.

You can design transforming searches that produce a single series, or you can set them up so the results provide data for multiple series. All of the chart visualizations can handle single-series searches, though you'll find that bar, column, line, and pie chart visualizations are usually best for such searches. In fact, pie charts can only display data from single-series searches. On the other hand, if your search produces multiple series, you'll want to go with a bar, column, line, area, or scatter chart visualization.

Column and bar charts

Use a column chart or bar chart to compare the frequency of values of fields in your data. In a column chart, the x-axis values are typically field values (or time, especially if your search uses the timechart reporting command) and the y-axis can be any other field value, count of values, or statistical calculation of a field value. Bar charts are exactly the same, except that the x-axis and y-axis values are reversed.

The following bar chart presents the results of this search, which uses internal Splunk metrics. It finds the total sum of cpu_seconds by processor in the last 15 minutes, and then arranges the processors with the top ten sums in descending order:

index=_internal "group=pipeline" | stats sum(cpu_seconds) as totalCPUSeconds by processor | sort 10 totalCPUSeconds desc

Note that in this example, we've also demonstrated how you can roll over a single bar or column to get detail information about it.

When you define the properties of your bar and column charts, you can:

• set the chart titles, as well as the titles of the x-axis and y-axis.
• set the minimum y-axis values for the y-axis (for example, if all the y-axis values of your search are above 100, it may improve clarity to have the chart start at 100).
• set the unit scale to Log (logarithmic) to improve clarity of charts where you have a mix of very small and very large y-axis values.

If you are formatting bar or column charts in dashboards with the Visualization Editor, you can additionally:

• set the major unit for the y-axis (for example, you can arrange to have tick marks appear in units of 10, or 20, or 45, whatever works best).
• determine the position of the chart legend and the manner in which the legend labels are truncated.
• determine whether charts are stacked, 100% stacked, or unstacked, and turn their drilldown functionality on or off. Bar and column charts are always unstacked by default. See the following subsection for details on stacking bar and column charts.

Stacked column and bar charts

When your base search involves more than one data series, you can use stacked column charts and stacked bar charts to compare the frequency of field values in your data.

In an unstacked column chart, the columns for different series are placed alongside each other. This may be fine if your chart is relatively simple (total counts of sales by month for two or three items in a store over the course of a year, for example), but when the series count increases it can make for a cluttered, confusing chart.

In a column chart set to a Stack mode of Stacked, all of the series columns for a single datapoint (such as a specific month in the chart described in the preceding paragraph) are stacked to become segments of a single column (one column per month, to reference that example again). The total value of the column is the sum of the segments.

Note: You use a stacked column or bar chart to highlight the relative weight (importance) of the different types of data that make up a specific dataset.

The following chart illustrates the customer views of pages in the website of MyFlowerShop, a hypothetical web-based flower store,
broken out by product category over a 7-day period. Here's the search that built that stacked chart:

sourcetype=access_* method=GET | timechart count by categoryId | fields _time BOUQUETS FLOWERS GIFTS SURPRISE TEDDY

Note the usage of the fields command. It ensures that the chart only displays counts of events with a product category ID; events without one (categorized as null by Splunk) are excluded.

The third Stack mode option, Stacked 100%, enables you to compare data distributions within a column or bar by making it fit to 100% of the length or width of the chart and then presenting its segments in terms of their proportion of the total "100%" of the column or bar. Stacked 100% can help you to better see data distributions between segments in a column or bar chart that contains a mix of very small and very large stacks when Stack mode is just set to Stacked.

Line and area charts

Line and area charts are commonly used to show data trends over time, though the x-axis can be set to any field value. The shaded areas in area charts can help to emphasize quantities. If your chart includes more than one series, each series will be represented by a differently colored line or area.

This chart is based on a simple search that reports on internal Splunk metrics:

index=_internal | timechart count by sourcetype

The following area chart is derived from this search, which also makes use of internal Splunk metrics:

index=_internal source=*metrics.log group=search_concurrency "system total" NOT user=* | timechart max(active_hist_searches) as "Historical Searches" max(active_realtime_searches) as "Real-time Searches"

When you define the properties of your line and area charts, you can:

• set the chart titles, as well as the titles of the x-axis and y-axis.
• set the minimum y-axis values (for example, if all the y-axis values of your search are above 100, it may improve clarity to have the chart start at 100).
• set the unit scale to Log (logarithmic) to improve clarity of charts where you have a mix of very small and very large y-axis values.
• determine whether charts are stacked, 100% stacked, or unstacked. See the following subsection for details on stacking line and area charts.
• determine what Splunk does with missing (null) y-axis values. You can have the system leave gaps for null datapoints, connect to zero datapoints, or just connect to the next positive datapoint. If you choose to leave gaps, Splunk will display markers for datapoints that are disconnected because they are not adjacent to other positive datapoints.

If you are formatting line or area charts in dashboards with the Visualization Editor, you can additionally:

• set the major unit for the y-axis (for example, you can arrange to have tick marks appear in units of 10, or 20, or 45, whatever works best).
• determine the position of the chart legend and the manner in which the legend labels are truncated.
• turn their drilldown functionality on or off.
Stacked line and area charts

Stacked line and area charts operate along the same principles as stacked column and bar charts (see above). Stacked line and area charts can help readers when several series are involved; it makes it easier to see how each data series relates to the entire set of data as a whole.

The following chart is another example of a chart that presents information from internal Splunk metrics. The search used to create it is:

index=_internal per_sourcetype_thruput | timechart sum(kb) by series useother=f

Pie chart

Use a pie chart to show the relationship of parts of your data to the entire set of data as a whole. The size of a slice in a pie graph is determined by the size of a value of part of your data as a percentage of the total of all values.

The following pie chart presents the views by referer domain for a hypothetical online store for the previous day. Note that you can get metrics for individual pie chart wedges by mousing over them.

When you define the properties of pie charts, you can set the chart title. If you are formatting pie charts in dashboards with the Visualization Editor, you can additionally:

• determine the position of the chart legend.
• turn pie chart drilldown functionality on or off.

Scatter chart

Use a scatter chart (or "scatter plot") to show trends in the relationships between discrete values of your data. This is different from a line graph, which usually plots a regular series of points. Generally, a scatter plot shows discrete values that do not occur at regular intervals or belong to a series.

Here's an example of a search that can be used to generate a scatter chart. It looks at USGS earthquake data (in this case a CSV file that presents all magnitude 2.5+ quakes recorded over a given 7-day period, worldwide), pulls out just the Californian quakes, plots out the quakes by magnitude and quake depth, and then color-codes them by region. As you can see, the majority of quakes recorded during this period were fairly shallow (10 or fewer meters in depth), with the exception of one quake that was around 27 meters deep. None of the quakes exceeded a magnitude of 4.0.

source=usgs Region=*California | table Region Magnitude Depth | sort Region

To generate the chart for this example, we've used the table command, followed by three fields. The first field is what appears in the legend (Region). The second field is the x-axis value (Magnitude), which leaves the third field (Depth) to be the y-axis value. Note that when you use table, the latter two fields must be numeric in nature.

You can download a current CSV file from the USGS Earthquake Feeds and add it as an input to Splunk, but the field names and format will be slightly different from the example shown here.

When you define the properties of your scatter charts, you can:

• set the chart titles, as well as the titles of the x-axis and y-axis.
• set the minimum y-axis values for the y-axis (for example, if all the y-axis values of your search are above 100, it may improve clarity to have the chart start at 100).
• set the unit scale to Log (logarithmic) to improve clarity of charts where you have a mix of very small and very large y-axis values.

If you are formatting scatter charts in dashboards with the Visualization Editor, you can additionally:

• set the major unit for the y-axis (for example, you can arrange to have tick marks appear in units of 10, or 20, or 45, whatever works best).
• determine the position of the chart legend and the manner in which the legend labels are truncated.
• turn their drilldown functionality on or off.

Split values into multiple series

Run, for example:

sourcetype=access_* | timechart count(eval(method="GET")) AS GET,
Split values into multiple series
Run for example:

sourcetype=access_* | timechart count(eval(method="GET")) AS GET, count(eval(method="POST")) AS POST

Then click the Visualization tab to see the result of this having two series. Make sure to select Line Chart.

Hands on Lab:
1. Upload a data file called ImplementingSplunkDataGenerator.tgz, located on the desktop.
2. Run:

source="ImplementingSplunkDataGenerator.tgz:*" host="WIN-SQM8ERRKEIJ" | chart count over date_month by date_wday

If you look back at the results from stats, the data is presented as one row per combination. Instead of a row per combination, chart generates the intersection of the two fields. You can specify multiple functions, but you may only specify one field each for over and by. Switching the fields (by rearranging our search statement a bit) turns the data the other way.

By simply clicking on the Visualization tab (to the right of the Statistics tab), we can see these results in a chart. This is an Area chart, with particular format options set. Within the chart area, you can click on Area to change the chart type (Line, Area, Column, Bar, and so on) or Format to change the format options (Stack, Multi-series Mode, Null Values, and Drilldown).

Bonus Lab: Create a chart from the Loan file csv on your desktop.

Omit null and other values from charts
Sometimes Splunk has extra null fields floating around. If your records have a unique Id field, then the following snippet removes null fields:

| stats values(*) as * by Id

The reason is that "stats values won't show fields that don't have at least one non-null value". If your records don't have a unique Id field, then you should create one first using streamstats:

| streamstats count as Id | stats values(*) as * by Id

Create a time chart
The timechart command
The timechart command generates a table of summary statistics which can then be formatted as a chart visualization where your data is plotted against an x-axis that is always a time field. Use timechart to display statistical trends over time, with the option of splitting the data with another field as a separate series in the chart. Timechart visualizations are usually line, area, or column charts.

Examples
Example 1: This report uses internal Splunk log data to visualize the average indexing thruput (indexing kbps) of Splunk processes over time, broken out by processor:

index=_internal "group=thruput" | timechart avg(instantaneous_eps) by processor

Chart multiple values on the same timeline
Refer to the lesson above for multiple series. Run for example:

sourcetype=access_* | timechart count(eval(method="GET")) AS GET, count(eval(method="POST")) AS POST

Then click the Visualization tab to see the result of this having two series. Make sure to select Line Chart.

Hands on Lab
Run:

sourcetype=access_* | timechart count(eval(method="GET")) AS GET, count(eval(method="POST")) AS POST

Create another example of a timechart with the Loan csv file.
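If the time buckets that timechart chooses by default are too coarse or too fine for the Loan data, you can control them with the span argument; the bucket size below is just an example:

sourcetype=access_* | timechart span=1h count(eval(method="GET")) AS GET, count(eval(method="POST")) AS POST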
Format charts
Let's go ahead and take a look at the (chart) Format options. Keep in mind that, depending on your search results and the visualization options that you select, you may or may not get a useable result. Some experimentation with the various options is recommended. These options are grouped as:
 General: Under general, you have the option to set the Stack Model (which indicates how Splunk will display your chart columns for different series: alongside each other or as a single column), determine how to handle Null Values (you can leave gaps for null data points, connect to zero data points, or just connect to the next positive data point), set the Multi-series mode (Yes or No), and turn Drilldown on or off (active or inactive).
 X-Axis: Is mostly visual. You can set a custom title, allow truncation of label captions, and set the rotation of the text for your chart labels.
 Y-Axis: Here you can set not just a custom title, but also the scale (Linear or Log), the interval, and the min and max values.
 Chart Overlay: Here you can set the following options:
  Overlay: Select a field to show as an overlay.
  View as Axis: Select On to map the overlay to a second Y-axis.
  Title: Specify a title for the overlay.
  Scale: Select Inherit, Linear, or Log. Inherit uses the scale for the base chart. Log provides a logarithmic scale, useful for minimizing the display of large peak values.
  Interval: Enter the units between tick marks in the axis.
  Min Value: The minimum value to display. Values less than the Min Value do not appear on the chart.
  Max Value: The maximum value to display. Values greater than the Max Value do not appear on the chart.
 Legend: Finally, under Legend, you can set Position (where to place the legend in the visualization, or whether to include the legend at all) and Truncation (set how to represent names that are too long to display).

Explain when to use each type of reporting command
A reporting command primer
This subsection covers the major categories of reporting commands and provides examples of how they can be used in a search. The primary reporting commands are:
 chart: used to create charts that can display any series of data that you want to plot. You can decide what field is tracked on the x-axis of the chart.
 timechart: used to create "trend over time" reports, which means that _time is always the x-axis.
 top: generates charts that display the most common values of a field.
 rare: creates charts that display the least common values of a field.
 stats, eventstats, and streamstats: generate reports that display summary statistics.
 associate, correlate, and diff: create reports that enable you to see associations, correlations, and differences between fields in your data.

Note: As you'll see in the following examples, you always place your reporting commands after your search commands, linking them with a pipe operator ("|").

chart, timechart, stats, eventstats, and streamstats are all designed to work in conjunction with statistical functions. The list of available statistical functions includes:
 count, distinct count
 mean, median, mode
 min, max, range, percentiles
 standard deviation, variance
 sum
 first occurrence, last occurrence

Hands on Lab
Please format your chart from the last lab exercise.

End of Module Hands on Quiz

Module 10 - Analyzing, Calculating, and Formatting Results
 Using the eval command
 Perform calculations
 Convert values
 Hands on Lab covering: Using the eval command, Perform calculations, Convert values
 Round values
 Format values
 Hands on Lab covering: Round values, Format values
 Use conditional statements
 Further filter calculated results
 Hands on Lab covering: Use conditional statements, Further filter calculated results
 End of Module Hands on Quiz

Using the eval command and perform calculations
Use the eval command and functions
The eval command enables you to devise arbitrary expressions that use automatically extracted fields to create a new field that takes the value that is the result of the expression's evaluation. The eval command is immensely versatile and useful. But while some eval expressions are relatively simple, they often can be quite complex.

Types of eval expressions
An eval expression is a combination of literals, fields, operators, and functions that represent the value of your destination field. The expression can involve a mathematical operation, a string concatenation, a boolean expression, a comparison expression, or a call to one of the eval functions. Eval expressions require that the field's values are valid for the type of operation. For example, with the exception of addition, arithmetic operations may not produce valid results if the values are not numerical. For addition, eval can concatenate the two operands if they are both strings. When concatenating values with '.', eval treats both values as strings, regardless of their actual type.
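To make that last point concrete, here is a minimal sketch of '.' concatenation (the field names are illustrative, not from the lab data):

... | eval session_key=clientip . "_" . user

Because '.' treats both operands as strings, this works even if one of the fields happens to hold a numeric value.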
Example 1: Use eval to define a field that is the sum of the areas of two circles, A and B. For circles A and B, the radii are radius_a and radius_b, respectively. The area of a circle is πr^2, where r is the radius. This eval expression uses the pi and pow functions to calculate the area of each circle, adds them together, and saves the result in a field named sum_of_areas:

... | eval sum_of_areas = pi() * pow(radius_a, 2) + pi() * pow(radius_b, 2)

Example 2: Use eval to define a location field using the city and state fields. This eval expression is a simple string concatenation. For example, if city=Philadelphia and state=PA, location="Philadelphia, PA".

... | eval location=city.", ".state

Convert values
The convert command converts field values into numerical values. Unless you use the AS clause, the original values are replaced by the new values.

Example 1
This example uses sendmail email server logs and refers to the logs with sourcetype=sendmail. The sendmail logs have two duration fields, delay and xdelay. The delay is the total amount of time a message took to deliver or bounce. The delay is expressed as "D+HH:MM:SS", which indicates the time it took in hours (HH), minutes (MM), and seconds (SS) to handle delivery or rejection of the message. If the delay exceeds 24 hours, the time expression is prefixed with the number of days and a plus character (D+). The xdelay is the total amount of time the message took to be transmitted during final delivery, and its time is expressed as "HH:MM:SS".

Change the sendmail duration format of delay and xdelay to seconds:

sourcetype=sendmail | convert dur2sec(delay) dur2sec(xdelay)

This search pipes all the sendmail events into the convert command and uses the dur2sec() function to convert the duration times of the fields, delay and xdelay, into seconds.
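If you want to keep the original values alongside the converted ones, rename the output with the AS clause; a small sketch of the same search:

sourcetype=sendmail | convert dur2sec(delay) AS delay_secs dur2sec(xdelay) AS xdelay_secs

Here delay and xdelay retain their original duration strings, while the new delay_secs and xdelay_secs fields hold the values in seconds.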
Here is how your search results look after you use the fields sidebar to add the fields to your events: you can compare the converted field values to the original field values in the events list.

Example 2
This example uses syslog data. Convert a UNIX epoch time to a more readable time formatted to show hours, minutes, and seconds.

sourcetype=syslog | convert timeformat="%H:%M:%S" ctime(_time) AS c_time | table _time, c_time

The ctime() function converts the _time value of syslog (sourcetype=syslog) events to the format specified by the timeformat argument. The timeformat="%H:%M:%S" argument tells the search to format the _time value as HH:MM:SS. Here, the table command is used to show the original _time value and the converted time, which is renamed c_time. The ctime() function changes the timestamp to a non-numerical value. This is useful for display in a report or for readability in your events list.

Example 3
This example uses syslog data. Convert a time in MM:SS.SSS (minutes, seconds, and subseconds) to a number in seconds.

sourcetype=syslog | convert mstime(_time) AS ms_time | table _time, ms_time

The mstime() function converts the _time value of syslog (sourcetype=syslog) events from a minutes-and-seconds format to just seconds. Here, the table command is used to show the original _time value and the converted time, which is renamed ms_time. The mstime() function changes the timestamp to a numerical value. This is useful if you want to use it for more calculations.

More examples
Example 1: Convert values of the "duration" field into a number value by removing string values in the field value. For example, if duration="212 sec", the resulting value is duration="212".

... | convert rmunit(duration)

Example 2: Change the sendmail syslog duration format (D+HH:MM:SS) to seconds. For example, if delay="00:10:15", the resulting value is delay="615".

... | convert dur2sec(delay)

Example 3: Change all memory values in the "virt" field to Kilobytes.

... | convert memk(virt)

Example 4: Convert every field value to a number value except for values in the field "foo". Use the "none" argument to specify fields to ignore.

... | convert auto(*) none(foo)

Hands on Lab
1. Run:

source="ImplementingSplunkDataGenerator.tgz:*" host="WIN-SQM8ERRKEIJ" error | stats count by logger user | eventstats sum(count) as totalcount | eval percent=count/totalcount*100 | sort -count

And explain the options.
2. Take the Loan csv file and develop some eval functions.

Round and format values functions
Usage
 All functions that accept strings can accept literal strings or any field.
 All functions that accept numbers can accept literal numbers or any numeric field.

Comparison and Conditional functions

case(X,"Y",...)
This function takes pairs of arguments X and Y. The X arguments are Boolean expressions that will be evaluated from first to last. When the first X expression is encountered that evaluates to TRUE, the corresponding Y argument will be returned. The function defaults to NULL if none are true.
Example: This example returns descriptions for the corresponding http status code:
... | eval description=case(error == 404, "Not found", error == 500, "Internal Server Error", error == 200, "OK")
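Because case defaults to NULL when nothing matches, a common idiom is to end with a catch-all expression that is always true; a sketch built on the same example:

... | eval description=case(error == 404, "Not found", error == 500, "Internal Server Error", error == 200, "OK", true(), "Other")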
cidrmatch("X",Y)
This function returns true when IP address Y belongs to a particular subnet X. The function uses two string arguments: the first is the CIDR subnet; the second is the IP address to match.
Examples: This example uses cidrmatch to set a field, isLocal, to "local" if the field ip matches the subnet, or "not local" if it does not:
... | eval isLocal=if(cidrmatch("123.132.32.0/25",ip), "local", "not local")
This example uses cidrmatch as a filter:
... | where cidrmatch("123.132.32.0/25", ip)

coalesce(X,...)
This function takes an arbitrary number of arguments and returns the first value that is not null. Let's say you have a set of events where the IP address is extracted to either clientip or ipaddress. This example defines a new field called ip that takes the value of either clientip or ipaddress, depending on which is not NULL (exists in that event):
... | eval ip=coalesce(clientip,ipaddress)

if(X,Y,Z)
This function takes three arguments. The first argument X must be a Boolean expression. If X evaluates to TRUE, the result is the second argument Y. If X evaluates to FALSE, the result evaluates to the third argument Z. This example looks at the values of error and returns err=OK if error=200, otherwise returns err=Error:
... | eval err=if(error == 200, "OK", "Error")

like(TEXT, PATTERN)
This function takes two arguments, a string to match TEXT and a match expression string PATTERN. It returns TRUE if and only if the first argument is like the SQLite pattern in PATTERN. The pattern language supports exact text match, as well as % characters for wildcards and _ characters for a single character match.
This example returns TRUE if the field value starts with foo:
... | eval is_a_foo=if(like(field, "foo%"), "yes a foo", "not a foo")
or
... | where like(field, "foo%")

match(SUBJECT, "REGEX")
This function compares the regex string REGEX to the value of SUBJECT and returns a Boolean value. It returns true if the REGEX can find a match against any substring of SUBJECT. This example returns true IF AND ONLY IF field matches the basic pattern of an IP address. Note that the example uses ^ and $ to perform a full match.
... | eval n=if(match(field, "^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}$"), 1, 0)

null()
This function takes no arguments and returns NULL. The evaluation engine uses NULL to represent "no value"; setting a field to NULL clears its value.

nullif(X,Y)
This function is used to compare fields. The function takes two arguments, X and Y, and returns NULL if X = Y. Otherwise it returns X.
... | eval n=nullif(fieldA,fieldB)

searchmatch(X)
This function takes one argument X, which is a search string. The function returns true IF AND ONLY IF the event matches the search string.
... | eval n=searchmatch("foo AND bar")
validate(X,Y,...)
This function takes pairs of arguments: Boolean expressions X and strings Y. The function returns the string Y corresponding to the first expression X that evaluates to FALSE, and defaults to NULL if all are TRUE. This example runs a simple check for valid ports:
... | eval n=validate(isint(port), "ERROR: Port is not an integer", port >= 1 AND port <= 65535, "ERROR: Port is out of range")

Conversion functions

tonumber(NUMSTR,BASE) or tonumber(NUMSTR)
This function converts the input string NUMSTR to a number, where BASE is optional and used to define the base of the number to convert to. BASE can be 2 to 36, and defaults to 10. If tonumber cannot parse a field value to a number (for example, if the value contains a leading and trailing space), the function returns NULL; use the trim function to remove leading or trailing spaces. If tonumber cannot parse a literal string to a number, it returns an error.
Example: This example returns "164":
... | eval n=tonumber("0A4", 16)

tostring(X,Y)
This function converts the input value to a string. If the input value is a number, it reformats it as a string. If the input value is a Boolean value, it returns the corresponding string value, "True" or "False". This function requires at least one argument X; the second argument Y is optional and can be "hex", "commas", or "duration":
 tostring(X,"hex") converts X to hexadecimal.
 tostring(X,"commas") formats X with commas and, if the number includes decimals, rounds to the nearest two decimal places.
 tostring(X,"duration") converts seconds X to the readable time format HH:MM:SS.
Examples: This example returns "True 0xF 12,345.68":
... | eval n=tostring(1==1) + " " + tostring(15, "hex") + " " + tostring(12345.6789, "commas")
This example returns foo=615 and foo2=00:10:15:
... | eval foo=615 | eval foo2 = tostring(foo, "duration")
This example formats the column totalSales to display values with a currency symbol and commas. You must use a period between the currency value and the tostring function:
... | fieldformat totalSales="$".tostring(totalSales,"commas")
Note: When used with the eval command, the values might not sort as expected because the values are converted to ASCII. Use the fieldformat command with the tostring function to format the displayed values; the underlying values are not changed with the fieldformat command.

Cryptographic functions

md5(X)
This function computes and returns the MD5 hash of a string value X.
... | eval n=md5(field)

sha1(X)
This function computes and returns the secure hash of a string value X based on the FIPS compliant SHA-1 hash function.
... | eval n=sha1(field)

sha256(X)
This function computes and returns the secure hash of a string value X based on the FIPS compliant SHA-256 hash function.
... | eval n=sha256(field)

sha512(X)
This function computes and returns the secure hash of a string value X based on the FIPS compliant SHA-512 hash function.
... | eval n=sha512(field)
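One common use for these hash functions (a sketch, not tied to the lab data) is fingerprinting whole events so that duplicates can be spotted or removed:

... | eval event_hash=md5(_raw) | dedup event_hash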
Date and Time functions

now()
This function takes no arguments and returns the time that the search was started. The time is represented in Unix time, or seconds since Epoch time.

relative_time(X,Y)
This function takes an epochtime time, X, as the first argument and a relative time specifier, Y, as the second argument and returns the epochtime value of Y applied to X.
... | eval n=relative_time(now(), "-1d@d")

strftime(X,Y)
This function takes an epochtime value, X, as the first argument and renders it as a string using the format specified by Y. This example returns the hour and minute from the _time field:
... | eval n=strftime(_time, "%H:%M")

strptime(X,Y)
This function takes a time represented by a string, X, and parses it into a timestamp using the format specified by Y. If timeStr is in the form "11:59", this returns it as a timestamp:
... | eval n=strptime(timeStr, "%H:%M")

time()
This function returns the wall-clock time with microsecond resolution. The value of time() will be different for each event based on when that event was processed by the eval command.

Informational functions

isbool(X)
This function takes one argument X and returns TRUE if X is Boolean.
... | eval n=if(isbool(field),"yes","no") or ... | where isbool(field)

isint(X)
This function takes one argument X and returns TRUE if X is an integer.
... | eval n=if(isint(field), "int", "not int") or ... | where isint(field)

isnotnull(X)
This function takes one argument X and returns TRUE if X is not NULL. This is a useful check for whether or not a field (X) contains a value.
... | eval n=if(isnotnull(field),"yes","no") or ... | where isnotnull(field)

isnull(X)
This function takes one argument X and returns TRUE if X is NULL.
... | eval n=if(isnull(field),"yes","no") or ... | where isnull(field)

isnum(X)
This function takes one argument X and returns TRUE if X is a number.
... | eval n=if(isnum(field),"yes","no") or ... | where isnum(field)

isstr(X)
This function takes one argument X and returns TRUE if X is a string.
... | eval n=if(isstr(field),"yes","no") or ... | where isstr(field)

typeof(X)
This function takes one argument and returns a string representation of its type. This example returns "NumberStringBoolInvalid":
... | eval n=typeof(12) + typeof("string") + typeof(1==2) + typeof(badfield)

Mathematical functions

abs(X)
This function takes a number X and returns its absolute value. This example returns the absnum field, whose values are the absolute values of the numeric field number:
... | eval absnum=abs(number)

ceil(X), ceiling(X)
This function rounds a number X up to the next highest integer. This example returns n=2:
... | eval n=ceil(1.9)

exact(X)
This function renders the result of a numeric eval calculation with a larger amount of precision in the formatted output.
... | eval n=exact(3.14 * num)

exp(X)
This function takes a number X and returns the exponential function e^X. The following example returns y=e^3:
... | eval y=exp(3)

floor(X)
This function rounds a number X down to the nearest whole integer. This example returns 1:
... | eval n=floor(1.9)

ln(X)
This function takes a number X and returns its natural log. This example returns the natural log of the values of bytes:
... | eval lnBytes=ln(bytes)

log(X,Y) or log(X)
This function takes either one or two numeric arguments and returns the log of the first argument X using the second argument Y as the base. If the second argument Y is omitted, this function evaluates the log of number X with base 10.
... | eval num=log(number, 2)

pi()
This function takes no arguments and returns the constant pi to 11 digits of precision.
... | eval area_circle=pi()*pow(radius, 2)

pow(X,Y)
This function takes two numeric arguments X and Y and returns X^Y.
... | eval area_circle=pi()*pow(radius, 2)

round(X,Y)
This function takes one or two numeric arguments X and Y, returning X rounded to the amount of decimal places specified by Y. The default is to round to an integer. This example returns n=4:
... | eval n=round(3.5)
This example returns n=2.56:
... | eval n=round(2.555, 2)

sigfig(X)
This function takes one argument X, a number, and rounds that number to the appropriate number of significant figures. 1.00*1111 = 1111, but:
... | eval n=sigfig(1.00*1111) returns n=1110.

sqrt(X)
This function takes one numeric argument X and returns its square root. This example returns 3:
... | eval n=sqrt(9)

Multivalue functions

commands(X)
This function takes a search string, or field that contains a search string, X, and returns a multivalued field containing a list of the commands used in X. (This is generally not recommended for use except for analysis of audit.log events.)
... | eval x=commands("search foo | stats count | sort count") returns a multivalued field that contains 'search', 'stats', and 'sort'.

mvappend(X,...)
This function takes an arbitrary number of arguments and returns a multivalue result of all the values. The arguments can be strings, multivalue fields, or single value fields.
... | eval fullName=mvappend(initial_values, "middle value", last_values)

mvcount(MVFIELD)
This function takes a field MVFIELD. The function returns the number of values if it is a multivalue field, 1 if it is a single value field, and NULL otherwise.
... | eval n=mvcount(multifield)
mvdedup(X)
This function takes a multivalue field X and returns a multivalue field with its duplicate values removed.
... | eval s=mvdedup(mvfield)

mvfilter(X)
This function filters a multivalue field based on an arbitrary Boolean expression X. The Boolean expression X can reference ONLY ONE field at a time. This example returns all of the values in field email that end in .net or .org:
... | eval n=mvfilter(match(email, "\.net$") OR match(email, "\.org$"))
Note: This function will return NULL values of the field x as well. If you don't want the NULL values, use the expression: mvfilter(x!=NULL).

mvfind(MVFIELD,"REGEX")
This function tries to find a value in multivalue field MVFIELD that matches the regular expression REGEX. If a match exists, the index of the first matching value is returned (beginning with zero). If no values match, NULL is returned.
... | eval n=mvfind(mymvfield, "err\d+")

mvindex(MVFIELD,STARTINDEX,ENDINDEX) or mvindex(MVFIELD,STARTINDEX)
This function takes two or three arguments, field MVFIELD and numbers STARTINDEX and ENDINDEX, and returns a subset of the multivalue field using the indexes provided. For mvindex(mvfield, startindex, [endindex]), endindex is inclusive and optional. Both startindex and endindex can be negative, where -1 is the last element. If endindex is not specified, it returns only the value at startindex. If the indexes are out of range or invalid, the result is NULL.
Since indexes start at zero, this example returns the third value in "multifield", if it exists:
... | eval n=mvindex(multifield, 2)

mvjoin(MVFIELD,STR)
This function takes two arguments, multivalue field MVFIELD and string delimiter STR. The function concatenates the individual values of MVFIELD with copies of STR in between as separators. This example joins together the individual values of "foo" using a semicolon as the delimiter:
... | eval n=mvjoin(foo, ";")

mvrange(X,Y,Z)
This function creates a multivalue field for a range of numbers. It can contain up to three arguments: a starting number X, an ending number Y (exclusive), and an optional step increment Z. If the increment is a timespan such as '7'd, the starting and ending numbers are treated as epoch times. This example returns a multivalue field with the values 1, 3, 5, 7, 9:
... | eval mv=mvrange(1, 11, 2)

mvsort(X)
This function uses a multivalue field X and returns a multivalue field with the values sorted lexicographically.
... | eval s=mvsort(mvfield)
mvzip(X,Y,"Z")
This function takes two multivalue fields, X and Y, and combines them by stitching together the first value of X with the first value of field Y, then the second with the second, and so on. This is similar to Python's zip command. The third argument, Z, is optional and is used to specify a delimiting character to join the two values. The default delimiter is a comma.
... | eval nserver=mvzip(hosts, ports)

Statistical functions
In addition to these functions, a comprehensive set of statistical functions is available to use with the stats, chart, and related commands.

max(X,...)
This function takes an arbitrary number of numeric or string arguments, and returns the max; strings are greater than numbers. This example returns either "foo" or field, depending on the value of field:
... | eval n=max(1, 3, 6, 7, "foo", field)

min(X,...)
This function takes an arbitrary number of numeric or string arguments, and returns the min; strings are greater than numbers. This example returns either 1 or field, depending on the value of field:
... | eval n=min(1, 3, 6, 7, "foo", field)

random()
This function takes no arguments and returns a pseudo-random integer ranging from zero to 2^31-1, for example: 0…2147483647.

Text functions

len(X)
This function returns the character length of a string X.
... | eval n=len(field)

lower(X)
This function takes one string argument and returns the lowercase version. The upper() function also exists for returning the uppercase version. This example returns the value provided by the field username in lowercase:
... | eval username=lower(username)

ltrim(X,Y) or ltrim(X)
This function takes one or two arguments X and Y and returns X with the characters in Y trimmed from the left side. If Y is not specified, spaces and tabs are removed. This example returns x="abcZZ":
... | eval x=ltrim(" ZZZZabcZZ ", " Z")

replace(X,Y,Z)
This function returns a string formed by substituting string Z for every occurrence of regex string Y in string X. The third argument Z can also reference groups that are matched in the regex. This example returns date with the month and day numbers switched, so if the input was 1/14/2015 the return value would be 14/1/2015:
... | eval n=replace(date, "^(\d{1,2})/(\d{1,2})/", "\2/\1/")

rtrim(X,Y) or rtrim(X)
This function takes one or two arguments X and Y and returns X with the characters in Y trimmed from the right side. If Y is not specified, spaces and tabs are removed. This example returns n="ZZZZabc":
... | eval n=rtrim(" ZZZZabcZZ ", " Z")
spath(X,Y)
This function takes two arguments: an input source field X and an spath expression Y, which is the XML or JSON formatted location path to the value that you want to extract from X. If Y is a literal string, it needs quotes. If Y is a field name (with values that are the location paths), it doesn't need quotes. This may result in a multivalued field. Read more about the spath search command.
This example returns the values of locDesc elements:
... | eval locDesc=spath(_raw, "vendorProductSet.product.desc.locDesc")
This example returns the hashtags from a twitter event:
index=twitter | eval output=spath(_raw, "entities.hashtags")

split(X,"Y")
This function takes two arguments, field X and delimiting character Y. It splits the value(s) of X on the delimiter Y and returns X as a multivalue field.
... | eval n=split(foo, ";")

substr(X,Y,Z)
This function takes either two or three arguments, where X is a string and Y and Z are numeric. It returns a substring of X, starting at the index specified by Y with the number of characters specified by Z. If Z is not given, it returns the rest of the string. The indexes follow SQLite semantics; they start at 1. Negative indexes can be used to indicate a start from the end of the string. This example concatenates "str" and "ing" together, returning "string":
... | eval n=substr("string", 1, 3) + substr("string", -3)

trim(X,Y) or trim(X)
This function takes one or two arguments X and Y and returns X with the characters in Y trimmed from both sides. If Y is not specified, spaces and tabs are removed. This example returns "abc":
... | eval n=trim(" ZZZZabcZZ ", " Z")

upper(X)
This function takes one string argument and returns the uppercase version. The lower() function also exists for returning the lowercase version. This example returns the value provided by the field username in uppercase:
... | eval n=upper(username)

urldecode(X)
This function takes one URL string argument X and returns the unescaped or decoded URL string. This example returns "http://www.splunk.com/download?r=header":
... | eval n=urldecode("http%3A%2F%2Fwww.splunk.com%2Fdownload%3Fr%3Dheader")

Trigonometry and Hyperbolic functions

acos(X)
This function computes the arc cosine of X, in the interval [0,pi] radians.
... | eval n=acos(0)
... | eval degrees=acos(0)*180/pi()

acosh(X)
This function computes the arc hyperbolic cosine of X, in radians.
... | eval n=acosh(2)

asin(X)
This function computes the arc sine of X, in the interval [-pi/2,+pi/2] radians.
... | eval n=asin(1)
... | eval degrees=asin(1)*180/pi()

asinh(X)
This function computes the arc hyperbolic sine of X, in radians.
... | eval n=asinh(1)

atan(X)
This function computes the arc tangent of X, in the interval [-pi/2,+pi/2] radians.
... | eval n=atan(0.50)

atan2(Y, X)
This function computes the arc tangent of Y, X in the interval [-pi,+pi] radians. Y is a value that represents the proportion of the y-coordinate. X is the value that represents the proportion of the x-coordinate. To compute the value, the function takes into account the sign of both arguments to determine the quadrant.
... | eval n=atan2(0.50, 0.75)

atanh(X)
This function computes the arc hyperbolic tangent of X, in radians.
... | eval n=atanh(0.500)

cos(X)
This function computes the cosine of an angle of X radians.
... | eval n=cos(-1)
... | eval n=cos(pi())

cosh(X)
This function computes the hyperbolic cosine of X radians.
... | eval n=cosh(1)

hypot(X,Y)
This function computes the hypotenuse of a right-angled triangle whose legs are X and Y. The function returns the square root of the sum of the squares of X and Y, as described in the Pythagorean theorem.
... | eval n=hypot(3, 4)

sin(X)
This function computes the sine, in radians.
... | eval n=sin(1)
... | eval n=sin(90 * pi()/180)

sinh(X)
This function computes the hyperbolic sine, in radians.
... | eval n=sinh(1)

tan(X)
This function computes the tangent, in radians.
... | eval n=tan(1)

tanh(X)
This function computes the hyperbolic tangent, in radians.
... | eval n=tanh(1)

Hands-on Lab
Please take a look at the Loan csv file. Use that file and some of the functions in the table in the manual. Take a look at round and some other functions that are very popular.

End of Module Quiz
Please refer to virtual machine for quiz

Module 11 - Creating Field Aliases and Calculated Fields
 Define naming conventions
 Create and use field aliases
 Create and use calculated fields
 Hands on Lab covering: Define naming conventions, Create and use field aliases, Create and use calculated fields
 End of Module Hands on Quiz

Define naming conventions
Example - Set up a naming convention for reports
You work in the systems engineering group of your company, and as the knowledge manager for your Splunk Enterprise implementation, it's up to you to come up with a naming convention for the reports produced by your team. In the end you develop a naming convention that pulls together:
 Group: Corresponds to the working group(s) of the user saving the search.
 Search type: Indicates the type of search (alert, report, summary-index-populating).
 Platform: Corresponds to the platform subjected to the search.
 Category: Corresponds to the concern areas for the prevailing platforms.
 Time interval: The interval over which the search runs (or on which the search runs, if it is a scheduled search).
 Description: A meaningful description of the context and intent of the search, limited to one or two words if possible. Ensures the search name is unique.
Group: SEG, NEG, OPS, NOC
Search type: Alert, Report, Summary
Platform: Windows, iSeries, Network
Category: Disk, Exchange, SQL, Event log, CPU, Jobs, Subsystems, Services, Security
Time interval: <arbitrary>
Description: <arbitrary>

Possible reports using this naming convention:
 SEG_Alert_Windows_Eventlog_15m_Failures
 SEG_Report_iSeries_Jobs_12hr_Failed_Batch
 NOC_Summary_Network_Security_24hr_Top_src_ip

Create and use field aliases
You can create multiple aliases for a field. The original field is not removed. This process enables you to search for the original field using any of its aliases. You can define aliases for fields that are extracted at index time as well as those that are extracted at search time. Note: Splunk Enterprise's field aliasing functionality does not currently support multivalue fields.

Important: Field aliasing is performed after key/value extraction but before field lookups. Therefore, you can specify a lookup table based on a field alias. This can be helpful if there are one or more fields in the lookup table that are identical to fields in your data, but have been named differently.

You add your field aliases to props.conf, which you edit in $SPLUNK_HOME/etc/system/local/, or your own custom app directory in $SPLUNK_HOME/etc/apps/. (We recommend using the latter directory if you want to make it easy to transfer your data customizations to other index servers.)

To alias fields:
1. Add the following line to a stanza in props.conf:
FIELDALIAS-<class> = <orig_field_name> AS <new_field_name>
<orig_field_name> is the original name of the field. <new_field_name> is the alias to assign to the field. You can include multiple field alias renames in one stanza.
2. Restart Splunk Enterprise for your changes to take effect.

Example of field alias additions for a lookup
Say you're creating a lookup for an external static table CSV file where the field you've extracted at search time as "ip" is referred to as "ipaddress." In the props.conf file where you've defined the extraction, you would add a line that defines "ipaddress" as an alias for "ip," as follows:

[accesslog]
EXTRACT-extract_ip = (?<ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})
FIELDALIAS-extract_ip = ip AS ipaddress

When you set up the lookup in props.conf, you can just use ipaddress where you'd otherwise have used ip:

[dns]
lookup_ip = dnsLookup ipaddress OUTPUT host
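Since one stanza can carry several renames, a quick sketch of what that might look like (the field names below are illustrative, not from the example above):

[accesslog]
FIELDALIAS-normalize = ip AS ipaddress status AS http_status

Each AS pair creates one alias, and the original fields remain searchable.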
Create and use calculated fields
The eval command is immensely versatile and useful. But while some eval expressions are relatively simple, they often can be quite complex. If you find that you need to use a particularly long and complex eval expression on a regular basis, you may find that retyping the expression accurately in search after search is tedious business. This is where calculated fields come to the rescue.

Calculated fields enable you to define fields with eval expressions in props.conf. Then, when you're writing out a search, you can cut out the eval expression entirely and reference the field like you would any other extracted field.

For example, take this example search, which examines earthquake data and classifies quakes by their depth by creating a new Description field:

source=eqs7day-M1.csv | eval Description=case(Depth<=70, "Shallow", Depth>70 AND Depth<=300, "Mid", Depth>300 AND Depth<=700, "Deep") | table Datetime, Region, Depth, Description

Using calculated fields, you could define the eval expression for the Description field in props.conf and write the search as:

source=eqs7day-M1.csv | table Datetime, Region, Depth, Description

Note: In the next section we show you how the Description calculated field would be set up in props.conf.

When you run the search, Splunk Enterprise will find the calculated field key in props.conf and evaluate it for every event that contains a Depth field. The fields will be extracted at search time and will be added to the events that include the fields in the eval expressions. You can now search on Description as if it is any other extracted field. You can also run searches like this:

source=eqs7day-M1.csv Description=Deep

Hands on Lab:
To create a calculated field go to: Settings -> Fields -> Add new (under the Calculated Fields section)
 Sourcetype: csv
 Name of field: a_test
 Eval function: annual_inc * 2
Save it. When you bring up the csv sourcetype in search, you will see that the field a_test doubled the amount of annual_inc. Now you can try other calculated fields, if you like.

End of Module Hands on Quiz
Please refer to virtual machine for test

Module 12 - Creating Field Extractions
 Perform field extractions using Field Extractor
 Hands on Lab covering: Perform field extractions using Field Extractor
 End of Module Hands on Quiz

Perform field extractions using Field Extractor
As Splunk Enterprise processes events, it extracts fields from them. This process is called field extraction.

Splunk Enterprise automatically extracts some fields
Splunk Enterprise extracts some fields from your events without assistance. It automatically extracts host, source, sourcetype, timestamps, and several other default fields when it indexes incoming events. It also extracts fields that appear in your event data as key=value pairs. This process of recognizing and extracting k/v pairs is called field discovery. You can disable field discovery to improve search performance. When fields appear in events without their keys, Splunk Enterprise uses pattern-matching rules called regular expressions to extract those fields as complete k/v pairs. For example, with a properly configured regular expression, Splunk Enterprise can extract user_id=johnz from the previous sample event.

To get all of the fields in your data, create custom field extractions
To use the power of Splunk Enterprise search, create additional field extractions. Custom field extractions allow you to capture and track information that is important to your needs, but which is not automatically discovered and extracted by Splunk Enterprise. All field extractions, including custom field extractions, are tied to a specific source, sourcetype, or host value. For example, if you create an ip field extraction, you might tie the extraction configuration for ip to sourcetype=access_combined. Any field extraction configuration you provide must include a regular expression that tells Splunk Enterprise how to find the field that you want to extract. Custom field extractions should take place at search time, but in certain rare circumstances you can arrange for some custom field extractions to take place at index time.
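As a quick ad hoc illustration of search-time extraction, the rex command applies a regular expression inline; a sketch, assuming events that contain a user_id=<value> pair:

... | rex "user_id=(?<user_id>\w+)"

The same regular expression could later be saved as a proper field extraction so the field is available without the rex step.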
Before you create custom field extractions, get to know your data
Before you begin to create field extractions, ensure that you are familiar with the formats and patterns of the event data associated with the source, sourcetype, or host that you are working with. One way is to investigate the predominant event patterns in your data with the Patterns tab.

Here are two events from the same source type, an apache server web access log:

131.253.24.135 - - [03/Jun/2014:20:49:53 -0700] "GET /wp-content/themes/aurora/style.css HTTP/1.1" 200 75017 "-" "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)"

10.1.0.14 - - [03/Jun/2014:20:49:33 -0700] "GET / HTTP/1.1" 200 7464 "http://www.splunk.com/download" "Mozilla/5.0 (compatible; Nmap Scripting Engine; http://nmap.org/book/nse.html)"

While these events contain different strings and characters, they are formatted in a consistent manner. They both present values for fields such as clientIP, method, status, bytes, and so on in a reliable order. Reliable means that the method value is always followed by the URI value, the URI value is always followed by the status value, the status value is always followed by the bytes value, and so on. When your events have consistent and reliable formats, you can create a field extraction that accurately captures multiple field values from them.

For contrast, look at this set of Cisco ASA firewall log events:

Jul 15 20:10:27 10.36.11.31 %ASA-6-113003: AAA group policy for user AmorAubrey is being set to Acme_techoutbound

Jul 15 20:12:42 10.36.11.11 %ASA-7-710006: IGMP request discarded from 10.36.11.36 to outside:87.194.216.51

Jul 15 20:13:52 10.36.11.28 %ASA-6-302014: Teardown TCP connection 517934 for Outside:128.241.220.82/1561 to Inside:10.123.124.28/8443 duration 0:05:02 bytes 297 Tunnel has been torn down (AMOSORTILEGIO)

Apr 19 11:24:32 PROD-MFS-002 %ASA-4-106103: access-list fmVPN-1300 denied udp for user 'sdewilde7' outside/12.130.60.4(137) -> inside1/10.157.200.154(137) hit-cnt 1 first hit [0x286364c7, 0x0]

While these events contain field values that are always space-delimited, they do not share a reliable format like the preceding two events. In order, these events represent:
1. A group policy change
2. An IGMP request
3. A TCP connection teardown
4. A firewall access denial for a request from a specific IP

Because these events differ so widely, it is difficult to create a single field extraction that can apply to each of these event patterns and extract relevant field values. In situations like this, where a specific host, source type, or source contains multiple event patterns, you may want to define field extractions that match each pattern, rather than designing a single extraction that can apply to all of the patterns. Inspect the events to identify text that is common and reliable for each pattern.

Using required text in field extractions
In the last four events, the strings of numbers that follow %ASA-#- have specific meanings. You can find their definitions in the Cisco documentation. When you have unique event identifiers like these in your data, specify them as required text in your field extraction. Required text strings limit the events that can match the regular expression in your field extraction. Specifying required text is optional, but it offers multiple benefits. Because required text reduces the set of events that it scans, it improves field extraction efficiency and decreases the number of false-positive field extractions. The field extractor utility enables you to highlight text in a sample event and specify that it is required text.
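To see how a message identifier doubles as required text, consider a sketch of an inline extraction against the first ASA event above (illustrative, not a production extraction):

... | rex "%ASA-6-113003: AAA group policy for user (?<policy_user>\w+)"

Anchoring the pattern on %ASA-6-113003 means it can only ever match group policy events, which is exactly what declaring that string as required text accomplishes in a saved extraction.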
Methods of custom field extraction in Splunk Enterprise
As a knowledge manager you oversee the set of custom field extractions created by users of your Splunk Enterprise implementation, and you might define specialized groups of custom field extractions yourself. The ways that you can do this include:
 The field extractor utility, which generates regular expressions for your field extractions.
 Adding field extractions through pages in Settings. You must provide a regular expression.
 Manual addition of field extraction configurations at the .conf file level. Provides the most flexibility for field extraction.

The field extraction methods that are available to Splunk Enterprise users are described in the following sections. All of these methods enable you to create search-time field extractions. To create an index-time field extraction, choose the third option: configure field extractions directly in configuration files.

Let the field extractor build extractions for you
The field extractor utility leads you step-by-step through the field extraction design process. It provides two methods of field extraction: regular expressions and delimiter-based field extraction.

The regular expression method is useful for extracting fields from unstructured event data, where events may follow a variety of different event patterns. It is also helpful if you are unfamiliar with regular expression syntax and usage, because it generates regular expressions and lets you validate them.

The delimiter-based field extraction method is suited to structured event data. Structured event data comes from sources like SQL databases and CSV files, and produces events where all fields are separated by a common delimiter, such as commas, spaces, or pipe characters. Regular expressions usually are not necessary for structured data events from a common source.

With the regular expression method of the field extractor you can:
 Set up a field extraction by selecting a sample event and highlighting fields to extract from that event.
 Create individual extractions that capture multiple fields.
 Improve extraction accuracy by detecting and removing false positive matches.
 Validate extraction results by using search filters to ensure specific values are being extracted.
 Specify that fields only be extracted from events that have a specific string of required text.
 Review stats tables of the field values discovered by your extraction.
 Manually configure the regular expression for the field extraction yourself.

With the delimiter method of the field extractor you can:
 Identify a delimiter to extract all of the fields in an event.
 Rename specific fields as appropriate.
 Validate extraction results.

The field extractor can only build search time field extractions that are associated with specific sources or source types in your data (no hosts).

Define field extractions with the Field Extractions and Field Transformations pages
You can use the Field Extractions and Field Transformations pages in Settings to define and maintain complex extracted fields in Splunk Web. This method of field extraction creation lets you create a wider range of field extractions than you can generate with the field extractor utility. It requires that you have the following knowledge:
 Understand how to design regular expressions.
 Have a basic understanding of how field extractions are configured in props.conf and transforms.conf.

If you create a custom field extraction that extracts its fields from _raw and does not require a field transform, use the field extractor utility. The field extractor can generate regular expressions, and it can give you feedback about the accuracy of your field extractions as you define them.

Use the Field Extractions page to create basic field extractions, or use it in conjunction with the Field Transformations page to define field extraction configurations that can do the following things:
 Reuse the same regular expression across multiple sources, source types, or hosts.
 Apply multiple regular expressions to the same source, source type, or host.
 Use a regular expression to extract fields from the values of another field.

The Field Extractions and Field Transformations pages define only search time field extractions.

Hands on Lab
Please refer to Lab on desktop

End of Module Hands on Quiz
Please refer to quiz on Virtual Machine

Module 13 - Creating Tags and Event Types
 Create and use tags
 Describe event types and their uses
 Create an event type
 Hands on Lab covering: Create and use tags, Describe event types and their uses, Create an event type
 End of Module Hands on Quiz

Create and use tags
Settings > Tags > List by tag name > Click Add new

Describe event types and their uses
Event types are a categorization system to help you make sense of your data. Event types let you sift through huge amounts of data, find similar patterns, and create alerts and reports.

Events versus event types
An event is a single record of activity within a log file. An event typically includes a timestamp and provides information about what occurred on the system being monitored or logged. An event type is a user-defined field that simplifies search by letting you categorize events. Event types let you classify events that have common characteristics. When your search results come back, they're checked against known event types. An event type is applied to an event at search time if that event matches the event type definition in eventtypes.conf.

Event type classification
There are several ways to create your own event types. Define event types via Splunk Web or through configuration files, tag or save event types after indexing your data, or save any search as an event type.

Create an event type
You complete a search, then click Save As > Event Type.

Hands on Lab covering
Please refer to Lab on desktop

End of Module Hands on Quiz
Please refer to quiz on virtual machine

Module 14 - Creating Workflow Actions
 Describe the function of a workflow action
 Create a GET workflow action
 Hands on Lab covering: Describe the function of a workflow action, Create a GET workflow action
 Create a POST workflow action
 Create a Search workflow action
 Hands on Lab covering: Create a POST workflow action, Create a SEARCH workflow action
 End of Module Hands on Quiz

Describe the function of a workflow action
Workflow actions have a wide variety of applications. For example, you can define workflow actions that enable you to:
 Perform an external WHOIS lookup based on an IP address found in an event.
 Use the field values in an HTTP error event to create a new entry in an external issue management system.
 Launch secondary searches that use one or more field values from selected events.
 Perform an external search (using Google or a similar web search application) on the value of a specific field found in an event.

In addition, you can define workflow actions that:
 Are targeted to events that contain a specific field or set of fields, or which belong to a particular event type.
 Appear either in field menus or event menus in search results. You can also set them up to only appear in the menus of specific fields, or in all field menus in a qualifying event.
 When selected, open either in the current window or in a new one.

Define workflow actions using Splunk Web
You can set up all of the workflow actions described in the bulleted list at the top of this chapter and many more using Splunk Web. To begin, navigate to Settings > Fields > Workflow actions.
On the Workflow actions page you can review and update existing workflow actions by clicking on their names. Or you can click Add new to create a new workflow action. Both methods take you to the workflow action detail page, where you define individual workflow actions. If you're creating a new workflow action, you need to give it a Name and identify its Destination app.

There are three kinds of workflow actions that you can set up:
 GET workflow actions, which create typical HTML links to do things like perform Google searches on specific values or run domain name queries against external WHOIS databases.
 POST workflow actions, which generate an HTTP POST request to a specified URI. This action type enables you to do things like create entries in external issue management systems using a set of relevant field values.
 Search workflow actions, which launch secondary searches that use specific field values from an event, such as a search that looks for the occurrence of specific combinations of ipaddress and http_status field values in your index over a specific time range.

Create a GET workflow action
GET link workflow actions drop one or more values into an HTML link. Clicking that link performs an HTTP GET request in a browser, allowing you to pass information to an external web resource, such as a search engine or IP lookup service.

To define a GET workflow action:
1. Navigate to Settings > Fields > Workflow Actions.
2. Click New to open up a new workflow action form.
3. Define a Label for the action. The Label field enables you to define the text that is displayed in either the field or event workflow menu. Labels can be static or include the value of relevant fields.
4. Determine whether the workflow action applies to specific fields or event types in your data. Use Apply only to the following fields to identify one or more fields. When you identify fields, the workflow action only appears for events that have those fields, either in their event menu or field menus. If you leave it blank or enter an asterisk the action appears in menus for all fields. Use Apply only to the following event types to identify one or more event types. If you identify an event type, the workflow action only appears in the event menus for events that belong to the event type.
5. For Show action in determine whether you want the action to appear in the Event menu, the Fields menus, or Both.
6. Set Action type to link.
7. In URI provide a URI for the location of the external resource that you want to send your field values to. Similar to the Label setting, when you declare the value of a field, you use the name of the field enclosed by dollar signs. Variables passed in GET actions via URIs are automatically URL encoded during transmission. This means you can include values that have spaces between words or punctuation characters.
8. Set the Link method to get.
9. Under Open link in, determine whether the workflow action displays in the current window or if it opens the link in a new window.
10. Click Save to save your workflow action definition.
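For instance, a GET action that passes a clientip value to an external lookup service might use a URI like this (the service URL is purely illustrative):

https://lookup.example.com/whois?q=$clientip$

At click time, $clientip$ is replaced with the value from the selected event and URL encoded automatically.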
Hands-on Lab
Please refer to Lab on desktop

Create a POST workflow action
Set up a POST workflow action
You set up POST workflow actions in a manner similar to that of GET link actions. POST requests are typically defined by a form element in HTML along with some inputs that are converted into POST arguments. This means that you have to identify POST arguments to send to the identified URI.

1. Navigate to Settings > Fields > Workflow Actions.
2. Click New to open up a new workflow action form.
3. Define a Label for the action. The Label field enables you to define the text that is displayed in either the field or event workflow menu. Labels can be static or include the value of relevant fields.
4. Determine whether the workflow action applies to specific fields or event types in your data. Use Apply only to the following fields to identify one or more fields. When you identify fields, the workflow action only appears for events that have those fields, either in their event menu or field menus. If you leave it blank or enter an asterisk the action appears in menus for all fields. Use Apply only to the following event types to identify one or more event types. If you identify an event type, the workflow action only appears in the event menus for events that belong to the event type.
5. For Show action in determine whether you want the action to appear in the Event menu, the Fields menus, or Both.
6. Set Action type to Link.
7. Set Link method to Post.
8. Under URI provide the URI for a web resource that responds to POST requests.
9. Under Post arguments define arguments that should be sent to the web resource at the identified URI. These arguments are key and value combinations. Enter the key in the first field, and the value in the second field. On both the key and value sides of the argument, you can use field names enclosed in dollar signs to identify the field value from your events that should be sent over to the resource. This means you can include values that have spaces between words or punctuation characters; Splunk software automatically HTTP-form encodes variables that it passes in POST link actions via URIs.
10. Click Add another field to create an additional POST argument. You can define multiple key/value arguments in one POST workflow action. Under Open link in, determine whether the workflow action displays in the current window or if it opens the link in a new window.
11. Click Save to save your workflow action definition.
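As an example, a POST action that files a ticket in a hypothetical issue tracker might combine static and field-based values (every name below is illustrative):

URI: https://tickets.example.com/api/new
Post arguments: summary = HTTP $status$ on $host$ ; details = $_raw$

At click time each $field$ token is replaced with the value from the selected event and HTTP-form encoded.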
Create a Search workflow action

To set up workflow actions that launch dynamically populated secondary searches, you start by setting Action type to search on the Workflow actions detail page. This reveals a set of Search configuration fields that you use to define the specifics of the secondary search.

In Search string, enter a search string that includes one or more placeholders for field values, bounded by dollar signs. For example, if you're setting up a workflow action that searches on client IP values that turn up in events, you might simply enter clientip=$clientip$ in that field.

Identify the app that the search runs in. If you want it to run in a view other than the current one, select that view. Be sure to set a time range for the search (or identify whether it should use the same time range as the search that created the field listing) by entering relative time modifiers in the Earliest time and Latest time fields. If these fields are left blank, the search runs over all time by default.

And as with all workflow actions, you can determine whether it opens in the current window or a new one. Finally, as with other workflow action types, you can restrict the search workflow action to events containing specific sets of fields and/or which belong to particular event types.

Hands-on Lab

Please refer to Lab on desktop

End of Module Quiz

Please refer to questions on virtual machine

Module 15 - Creating and Managing Alerts

 Describe alerts
 Create alerts
 View fired alerts
 Hands on Lab covering: Describe alerts, Create alerts, View fired alerts
 End of Module Hands on Quiz

Describe alerts

An alert is an action that a saved search triggers based on the results of the search. Typically the action is an email based on the results of the search, but you can also choose to run a script or to list the alert as a triggered alert in Settings. When you create an alert you are creating a saved search with trigger conditions for the alert. When creating an alert, you specify a condition that triggers the alert. To avoid sending out alerts too frequently, specify a throttle condition for the alert.

The following list describes the types of alerts:

 Per result alert. Based on a real-time search. The trigger condition is whenever the search returns a result.
 Scheduled alert. Runs a search according to a schedule that you specify when creating the alert. You specify results of the search that trigger the alert.
 Rolling-window alert. Based on a real-time search. The trigger condition is a combination of specified results of the search within a specified time window.

Create alerts

A scheduled alert runs periodically at a scheduled time, responding to a condition that triggers the alert. This example uses a search to track when there are too many errors in a Splunk Enterprise instance during the last 24 hours. The alert sends an email every day at 10:00 AM when the number of errors exceeds the threshold. When the number of errors exceeds 5, the alert sends an email with information about the conditions that triggered the alert.

1. From the Search Page, create the following search:

   index=_internal " error " NOT debug source=*splunkd.log* earliest=-24h latest=now

2. Click Save As > Alert.
3. Specify the following values for the fields in the Save As Alert dialog box:

   Title: Errors in the last 24 hours
   Alert type: Scheduled
   Time Range: Run every day
   Schedule: At 10:00
   Trigger condition: Number of Results
   Trigger if number of results: is Greater than 5

4. Click Next.
5. Click Send Email.
6. Set the following email settings, using tokens in the Subject and Message fields:

   To: email recipient
   Priority: Normal
   Subject: Too many errors alert: $name$
   Message: There were $job.resultCount$ errors reported on $trigger_date$.
   Include: Link to Alert and Link to Results

   Accept defaults for all other options.

7. Click Save.

After you create the alert you can view and edit the alert in the Alerts Page. When the alert triggers, it sends an email in which the tokens in the Subject and Message fields are replaced with values from the triggering search.
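Under the hood, saving this alert creates a saved search with trigger conditions attached. As a rough sketch only, the resulting stanza in savedsearches.conf would resemble the following; attribute names and defaults vary across versions, so verify them against the savedsearches.conf specification before relying on this.

    [Errors in the last 24 hours]
    search = index=_internal " error " NOT debug source=*splunkd.log*
    dispatch.earliest_time = -24h
    dispatch.latest_time = now
    enableSched = 1
    cron_schedule = 0 10 * * *
    counttype = number of events
    relation = greater than
    quantity = 5
    action.email = 1
    action.email.to = recipient@example.com
    action.email.subject = Too many errors alert: $name$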
View fired alerts

Simply go to the Alerts Page on the top Toolbar.

Hands-on Lab

Please refer to Lab on desktop

End of Module Quiz

Please refer to questions on virtual machine

Module 16 - Creating and Using Macros

 Describe macros
 Manage macros
 Create and use a basic macro
 Hands on Lab covering: Describe macros, Manage macros, Create and use a basic macro
 Define arguments and variables for a macro
 Add and use arguments with a macro
 Hands on Lab covering: Define arguments and variables for a macro, Add and use arguments with a macro
 End of Module Hands on Quiz

Describe Macros

Search macros are chunks of a search that you can reuse in multiple places, including saved and ad hoc searches. Search macros can be any part of a search, such as an eval statement or search term, and do not need to be a complete command. You can also specify whether or not the macro takes any arguments.

Manage and create macros

In Settings > Advanced Search > Search macros, click "New" to create a new search macro.

Define the search macro and its arguments

Your search macro can be any chunk of your search string or search command pipeline that you want to re-use as part of another search. If the search macro requires the user to input arguments, they are tokenized and indicated by wrapping dollar signs around the arguments, for example, $arg1$. The argument values are then specified when the search macro is invoked.

 Destination app is the name of the app you want to restrict your search macro to. By default, your search macros are restricted to the Search app.
 Name is the name of your search macro, such as mymacro. If your search macro takes arguments, you need to indicate this by appending the number of arguments to the name. For example, if mymacro required two arguments, it should be named mymacro(2). You can create multiple search macros that have the same name but require different numbers of arguments: foo, foo(1), foo(2), etc.
 Definition is the string that your search macro expands to when referenced in another search. If Eval Generated Definition? is checked, the 'Definition' is expected to be an eval expression that returns a string that represents the expansion of this macro.

Note: If a macro definition includes a leading pipe character ("|"), for example "| metadata type=sources", you may not use it as the first term in searches from the UI. The UI constructs the search as if the macro name were a search term; it does not do the macro expansion and so cannot correctly identify the initial pipe to differentiate it from a regular search term, which after expansion would cause the metadata command to be incorrectly formed and therefore invalid.
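Bringing Name, Definition, and Arguments together, here is a sketch of a hypothetical one-argument macro that wraps the splunkd error search used in the alerts module (the macro name and argument name are invented for illustration):

    Name:       splunkd_errors(1)
    Definition: index=_internal " error " NOT debug source=*splunkd.log* earliest=-$hours$h
    Arguments:  hours

Because the macro takes one argument, (1) is appended to the macro name, and the argument is referenced in the definition by wrapping its name in dollar signs. The Arguments field itself is described next.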
 Arguments are a comma-delimited string of argument names. Argument names may only contain the characters: alphanumeric 'a-z, A-Z, 0-9', underscore '_', and dash '-'. This list should not contain any repeated elements.

Validate your argument values

You can verify that the argument values used to invoke the search macro are acceptable.

 Validation Expression is a string that is an 'eval' expression that evaluates to a boolean or a string. If the validation expression is a boolean expression, validation succeeds when it returns true. If it returns false or is null, validation fails, and the Validation Error Message is returned. If the validation expression is not a boolean expression, it is expected to return a string or NULL. If it returns null, validation is considered a success. Otherwise, the string returned is rendered as the error string.

How to invoke search macros is discussed in the following section.

Apply macros to saved and ad hoc searches

To include a search macro in your saved or ad hoc searches, use the left quote (also known as a grave accent) character. On most English-language keyboards, this character is located on the same key as the tilde (~). Note: Do NOT use the straight quote character that appears on the same key as the double quote ("). You can also reference a search macro within other search macros using this same syntax.

If a macro argument includes quotes, you need to escape the quotes when you call the macro in your search. For example, if you wanted to pass a quoted string as your macro's argument, you would use: `my-macro("He said \"hello!\"")`.
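Continuing the hypothetical splunkd_errors(1) macro sketched earlier, its argument could be validated and the macro invoked as follows. The exact way arguments are referenced in validation expressions is worth verifying against the documentation for your Splunk version.

    Validation Expression:    isnum($hours$)
    Validation Error Message: The hours argument must be a number.

    `splunkd_errors(24)` | stats count by source

Here the backtick-wrapped `splunkd_errors(24)` expands to the macro definition with $hours$ replaced by 24, and the expanded search is then piped to stats.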
Hands-on Lab

Please refer to Lab on desktop

End of Module Quiz

Please refer to virtual machine for quiz

Module 17 - Using Pivot

 Describe Pivot
 Understand the relationship between data models and pivot
 Select a data model object
 Hands on Lab covering: Describe Pivot, Understand the relationship between data models and pivot, Select a data model object
 Create a pivot report
 Save pivot report as a dashboard
 Hands on Lab covering: Create a pivot report, Save pivot report as a dashboard
 End of Module Hands on Quiz

Describe Pivot

The Pivot tool lets you report on a specific data set without the Splunk Enterprise Search Processing Language (SPL™). First, identify a dataset that you want to report on, and then use a drag-and-drop interface to design and generate pivots that present different aspects of that data in the form of tables, charts, and other visualizations.

How does Pivot work? It uses data models to define the broad category of event data that you're working with, and then uses hierarchically arranged collections of data model objects to further subdivide the original dataset and define the attributes that you want Pivot to return results on. Data models and their objects are designed by the knowledge managers in your organization. They do a lot of hard work for you to enable you to quickly focus on a specific subset of event data.

What is a data model?

A data model is a hierarchically structured search-time mapping of semantic knowledge about one or more datasets. It encodes the domain knowledge necessary to build a variety of specialized searches of those datasets. These specialized searches are used by Splunk Enterprise to generate reports for Pivot users. If you are familiar with relational database design, think of data models as analogs to database schemas. When you plug them into the Pivot Editor, they let you generate statistical tables, charts, and visualizations based on column and row configurations that you select.

Understand the relationship between data models and pivot

Data models drive the Pivot tool. They enable users of Pivot to create compelling reports and dashboards without designing the searches that generate them. Data models can have other uses, especially for Splunk Enterprise app developers.

Splunk Enterprise knowledge managers design and maintain data models. These knowledge managers understand the format and semantics of their indexed data and are familiar with the Splunk Enterprise search language. To create an effective data model, you must understand your data sources and your data semantics. This information can affect your data model architecture--the manner in which the objects that make up the data model are organized. In building a typical data model, knowledge managers use knowledge object types such as lookups, transactions, search-time field extractions, and calculated fields.

Data models are composed of objects, which can be arranged in hierarchical structures of parent and child objects. Each child object represents a subset of the dataset covered by its parent object. When a Pivot user designs a pivot report, she selects the data model that represents the category of event data that she wants to work with, such as Web Intelligence or Email Logs. Then she selects an object within that data model that represents the specific dataset on which she wants to report.

Data Model Object

Data models are composed of one or more objects. Here are some basic facts about data model objects:

 An object is a specification for a dataset. Each data model object corresponds in some manner to a set of data in an index. You can apply data models to different indexes and get different datasets.
 Objects break down into four types. These types are: event objects, search objects, transaction objects, and child objects. The top-level event, search, and transaction objects in data models are collectively referred to as "root objects."
 Objects are hierarchical. Objects in data models can be arranged hierarchically in parent/child relationships.
 Child objects have inheritance. Child objects inherit constraints and attributes from their parent objects and have additional constraints and attributes of their own. Data model objects are defined by characteristics that mostly break down into constraints and attributes.
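As a purely illustrative sketch, a simple web-traffic data model might arrange its objects like this (the object names and constraints are invented):

    HTTP Request      root event object; constraint: sourcetype=access_combined
      HTTP Success    child object; inherits the root constraint and adds status<400
      HTTP Error      child object; inherits the root constraint and adds status>=400

A Pivot user who selects the HTTP Error object reports only on the error subset of the data, without writing any SPL.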
Hands-on Lab

Please refer to Lab on desktop

End of Module Quiz

Please refer to virtual machine for Quiz

End of Course Quiz

Please refer to virtual machine for Quiz