Log analysis

In computer log management and intelligence, log analysis (or system and network log analysis) is an art and science seeking to make sense of computer-generated records (also called log or audit trail records). The process of creating such records is called data logging.

Typical reasons why people perform log analysis include compliance with security policies and audit regulations, system troubleshooting, forensics, security incident response, and understanding online user behaviour.

Logs are emitted by network devices, operating systems, applications and all manner of intelligent or programmable devices. A stream of messages in time sequence often comprises a log. Logs may be directed to files and stored on disk or directed as a network stream to a log collector.

Log messages must usually be interpreted with respect to the internal state of their source (e.g., an application), and they announce security-relevant or operations-relevant events (e.g., a user login or a system error).

Logs are often created by software developers to aid in the debugging of the operation of an application or understanding how users are interacting with a system, such as a search engine. The syntax and semantics of data within log messages are usually application or vendor-specific. The terminology may also vary; for example, the authentication of a user to an application may be described as a log in, a logon, a user connection or an authentication event. Hence, log analysis must interpret messages within the context of an application, vendor, system or configuration to make useful comparisons to messages from different log sources.

Log message format or content may not always be fully documented. A task of the log analyst is to induce the system to emit the full range of messages to understand the complete domain from which the messages must be interpreted.

A log analyst may map varying terminology from different log sources into a uniform, normalized terminology so that reports and statistics can be derived from a heterogeneous environment. For example, log messages from Windows, Unix, network firewalls, and databases may be aggregated into a "normalized" report for the auditor. Different systems may signal different message priorities with a different vocabulary, such as "error" and "warning" vs. "err", "warn", and "critical".
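
As a rough illustration of this kind of mapping, the following Python sketch converts source-specific severity words and event names into a uniform vocabulary. The mapping tables are illustrative assumptions for the example, not any vendor's actual terminology.

    # Illustrative terminology normalization across heterogeneous log sources.
    # The vocabulary tables are assumptions for this sketch, not real vendor terms.
    SEVERITY_MAP = {
        "error": "ERROR", "err": "ERROR", "critical": "ERROR",
        "warning": "WARNING", "warn": "WARNING",
        "information": "INFO", "info": "INFO", "notice": "INFO",
    }
    EVENT_MAP = {
        "log in": "authentication", "logon": "authentication",
        "user connection": "authentication", "authentication event": "authentication",
    }

    def normalize(source, severity, event):
        """Map source-specific vocabulary onto one schema for cross-source reporting."""
        return {
            "source": source,
            "severity": SEVERITY_MAP.get(severity.lower(), "UNKNOWN"),
            "event_type": EVENT_MAP.get(event.lower(), event.lower()),
        }

    print(normalize("windows", "Information", "Logon"))
    print(normalize("unix", "err", "log in"))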

Hence, log analysis practices exist on the continuum from text retrieval to reverse engineering of software.

Functions and technologies

Pattern recognition is the function of selecting incoming messages and comparing them with a pattern book in order to filter them or handle them in different ways.
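
A minimal Python sketch of this idea, assuming a hand-written pattern book of regular expressions and made-up handling actions:

    import re

    # Illustrative "pattern book": each entry pairs a regular expression
    # with an assumed handling decision for matching messages.
    PATTERN_BOOK = [
        (re.compile(r"Failed password for (?P<user>\S+)"), "alert"),
        (re.compile(r"session opened for user (?P<user>\S+)"), "keep"),
        (re.compile(r"CRON\[\d+\]"), "discard"),
    ]

    def handle(message):
        """Compare an incoming message with the pattern book and choose an action."""
        for pattern, action in PATTERN_BOOK:
            if pattern.search(message):
                return action
        return "review"  # unmatched messages fall through for manual review

    print(handle("sshd[412]: Failed password for root from 10.0.0.5"))    # alert
    print(handle("CRON[2211]: (root) CMD (run-parts /etc/cron.hourly)"))  # discard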

Normalization is the function of converting message parts to the same format (e.g. common date format or normalized IP address).
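
For example, the sketch below (plain Python standard library; the input formats are assumptions) re-emits a timestamp in one common format and canonicalizes an IP address:

    from datetime import datetime, timezone
    import ipaddress

    def normalize_timestamp(raw, fmt):
        """Parse a source-specific timestamp and re-emit it as ISO 8601 in UTC."""
        return datetime.strptime(raw, fmt).replace(tzinfo=timezone.utc).isoformat()

    def normalize_ip(raw):
        """Canonicalize an IP address, e.g. collapse a long IPv6 form to its short form."""
        return str(ipaddress.ip_address(raw))

    print(normalize_timestamp("12/Mar/2024:13:55:36", "%d/%b/%Y:%H:%M:%S"))
    print(normalize_ip("2001:0db8:0000:0000:0000:0000:0000:0001"))  # 2001:db8::1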

Classification and tagging means ordering messages into different classes or tagging them with different keywords for later usage (e.g., filtering or display).
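
A toy Python example of tagging, where the keyword lists are assumptions chosen for the sketch:

    # Illustrative tag vocabulary; the keyword lists are assumptions for this sketch.
    TAG_KEYWORDS = {
        "security": ["failed password", "denied", "unauthorized"],
        "hardware": ["disk", "temperature", "fan"],
        "network": ["timeout", "unreachable", "connection reset"],
    }

    def classify(message):
        """Attach zero or more class tags to a message for later filtering or display."""
        text = message.lower()
        return [tag for tag, words in TAG_KEYWORDS.items()
                if any(word in text for word in words)]

    print(classify("kernel: disk temperature above threshold"))        # ['hardware']
    print(classify("sshd: Failed password for admin, access denied"))  # ['security']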

Correlation analysis is a technique for collecting messages from different systems and finding all the messages belonging to one single event (e.g., messages generated by malicious activity on different systems: network devices, firewalls, servers, etc.). It is usually connected with alerting systems.
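
A simplified Python sketch of the grouping step, assuming the records have already been normalized and that messages from the same source address within a five-minute window belong to one event (both assumptions are for illustration only):

    from collections import defaultdict
    from datetime import datetime

    # Hypothetical, already-normalized records from different systems.
    records = [
        {"time": "2024-03-12T13:55:01", "system": "firewall", "src_ip": "10.0.0.5", "msg": "port scan detected"},
        {"time": "2024-03-12T13:55:09", "system": "server",   "src_ip": "10.0.0.5", "msg": "failed ssh login"},
        {"time": "2024-03-12T14:30:00", "system": "server",   "src_ip": "10.0.0.9", "msg": "session opened"},
    ]

    WINDOW_SECONDS = 300  # assumption: same source within 5 minutes = same event

    def correlate(records):
        """Group messages by source IP, then split a group wherever the time gap exceeds the window."""
        by_ip = defaultdict(list)
        for r in sorted(records, key=lambda r: r["time"]):
            by_ip[r["src_ip"]].append(r)
        events = []
        for group in by_ip.values():
            current = [group[0]]
            for prev, cur in zip(group, group[1:]):
                gap = datetime.fromisoformat(cur["time"]) - datetime.fromisoformat(prev["time"])
                if gap.total_seconds() > WINDOW_SECONDS:
                    events.append(current)
                    current = []
                current.append(cur)
            events.append(current)
        return events

    for event in correlate(records):
        print([(r["system"], r["msg"]) for r in event])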

Artificial ignorance is a machine-learning approach that discards log entries known to be uninteresting; it is a method of detecting anomalies in a working system. In log analysis, this means recognizing and ignoring the regular, common log messages that result from the normal operation of the system and are therefore of little interest. New messages that have not appeared in the logs before, however, can signal important events and should therefore be investigated. [1] [2] In addition to such anomalies, the approach can identify expected events that did not occur; for example, a system update that normally runs every week may have failed to run.
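
A minimal Python sketch of the idea, where both the list of known-boring patterns and the expected weekly event are assumptions for illustration:

    import re

    # Patterns for messages known to be routine; the list is an assumption for this sketch.
    KNOWN_BORING = [
        re.compile(r"session (opened|closed) for user"),
        re.compile(r"CRON\[\d+\]"),
    ]

    # Events expected to appear at least once per batch (e.g. a weekly update job).
    EXPECTED = {"weekly-update": re.compile(r"weekly update completed")}

    def artificial_ignorance(lines):
        """Discard known-normal lines; report novel lines and expected events that never appeared."""
        novel = [line for line in lines
                 if not any(p.search(line) for p in KNOWN_BORING)]
        seen = {name for name, p in EXPECTED.items()
                if any(p.search(line) for line in lines)}
        return novel, set(EXPECTED) - seen

    batch = [
        "sshd: session opened for user alice",
        "CRON[311]: (root) CMD (logrotate)",
        "kernel: I/O error on /dev/sda",  # novel, worth investigating
    ]
    print(artificial_ignorance(batch))  # (['kernel: I/O error on /dev/sda'], {'weekly-update'})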

Log analysis is often compared to other analytics tools such as application performance management (APM) and error monitoring. While much of their functionality clearly overlaps, the difference is rooted in process: APM emphasizes performance and is used mostly in production, whereas error monitoring is driven by developers rather than operations and integrates into code in exception handling blocks.
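
To illustrate the "exception handling blocks" point, the sketch below uses the Python standard-library logger as a stand-in for a real error-monitoring client; the component name and the use of logging instead of a vendor SDK are assumptions of the example.

    import logging

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("payments")  # hypothetical component name

    def charge(amount):
        if amount <= 0:
            raise ValueError("amount must be positive")

    try:
        charge(-5.0)
    except ValueError:
        # A real error-monitoring client would be called here; logging.exception
        # records the message together with the stack trace of the active exception.
        logger.exception("charge failed")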

Related Research Articles

Simple Network Management Protocol (SNMP) is an Internet Standard protocol for collecting and organizing information about managed devices on IP networks and for modifying that information to change device behaviour. Devices that typically support SNMP include cable modems, routers, switches, servers, workstations, printers, and more.

Keystroke logging: Action of recording the keys struck on a keyboard

Keystroke logging, often referred to as keylogging or keyboard capturing, is the action of recording (logging) the keys struck on a keyboard, typically covertly, so that a person using the keyboard is unaware that their actions are being monitored. Data can then be retrieved by the person operating the logging program. A keystroke recorder or keylogger can be either software or hardware.

Intrusion detection system: Network protection device or software

An intrusion detection system is a device or software application that monitors a network or systems for malicious activity or policy violations. Any intrusion activity or violation is typically reported either to an administrator or collected centrally using a security information and event management (SIEM) system. A SIEM system combines outputs from multiple sources and uses alarm filtering techniques to distinguish malicious activity from false alarms.

Network security: Computer network access control

Network security consists of the policies, processes and practices adopted to prevent, detect and monitor unauthorized access, misuse, modification, or denial of a computer network and network-accessible resources. Network security involves the authorization of access to data in a network, which is controlled by the network administrator. Users choose or are assigned an ID and password or other authenticating information that allows them access to information and programs within their authority. Network security covers a variety of computer networks, both public and private, that are used in everyday jobs: conducting transactions and communications among businesses, government agencies and individuals. Networks can be private, such as within a company, or open to public access. Network security is involved in organizations, enterprises, and other types of institutions. It does as its title explains: it secures the network and protects and oversees the operations carried out on it. The most common and simple way of protecting a network resource is by assigning it a unique name and a corresponding password.

FCAPS is the ISO Telecommunications Management Network model and framework for network management. FCAPS is an acronym for fault, configuration, accounting, performance, security, the management categories into which the ISO model defines network management tasks. In non-billing organizations accounting is sometimes replaced with administration.

Internet security: Branch of computer security

Internet security is a branch of computer security. It encompasses the Internet, browser security, web site security, and network security as it applies to other applications or operating systems as a whole. Its objective is to establish rules and measures to use against attacks over the Internet. The Internet is an inherently insecure channel for information exchange, with high risk of intrusion or fraud, such as phishing, online viruses, trojans, ransomware and worms.

In computing, syslog is a standard for message logging. It allows separation of the software that generates messages, the system that stores them, and the software that reports and analyzes them. Each message is labeled with a facility code, indicating the type of system generating the message, and is assigned a severity level.
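
For example, the numeric priority value that prefixes a syslog message encodes both fields as facility × 8 + severity, so it can be decoded as in this short Python sketch:

    # Decode the syslog priority value (PRI) that prefixes a message, e.g. "<34>...".
    # By the syslog standard, PRI = facility * 8 + severity.
    SEVERITIES = ["emergency", "alert", "critical", "error",
                  "warning", "notice", "informational", "debug"]

    def decode_pri(pri):
        facility, severity = divmod(pri, 8)
        return facility, SEVERITIES[severity]

    print(decode_pri(34))  # (4, 'critical'): facility 4 is the security/auth facility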

An information security audit is an audit of the level of information security in an organization. It is an independent review and examination of system records, activities, and related documents. These audits are intended to improve the level of information security, avoid improper information security designs, and optimize the efficiency of the security safeguards and security processes. Within the broad scope of auditing information security there are multiple types of audits, multiple objectives for different audits, etc. Most commonly the controls being audited can be categorized as technical, physical and administrative. Auditing information security covers topics from auditing the physical security of data centers to auditing the logical security of databases, and highlights key components to look for and different methods for auditing these areas.

Systems management refers to enterprise-wide administration of distributed systems, including computer systems. Systems management is strongly influenced by network management initiatives in telecommunications. Application performance management (APM) technologies are now a subset of systems management. Maximum productivity can be achieved more efficiently through event correlation, system automation and predictive analysis, which are now all part of APM.

Security testing: The process of finding flaws in the security of information systems

Security testing is a process intended to reveal flaws in the security mechanisms of an information system that protect data and maintain functionality as intended. Due to the logical limitations of security testing, passing the security testing process is not an indication that no flaws exist or that the system adequately satisfies the security requirements.

Database security concerns the use of a broad range of information security controls to protect databases against compromises of their confidentiality, integrity and availability. It involves various types or categories of controls, such as technical, procedural/administrative and physical.

There are a number of security and safety features new to Windows Vista, most of which are not available in any prior Microsoft Windows operating system release.

Log management (LM) comprises an approach to dealing with large volumes of computer-generated log messages.

Windows Vista contains a range of new technologies and features that are intended to help network administrators and power users better manage their systems. Notable changes include a complete replacement of both the Windows Setup and the Windows startup processes, completely rewritten deployment mechanisms, new diagnostic and health monitoring tools such as a random access memory diagnostic program, support for per-application Remote Desktop sessions, a completely new Task Scheduler, and a range of new Group Policy settings covering many of the features new to Windows Vista. Subsystem for UNIX Applications, which provides a POSIX-compatible environment, is also introduced.

In computing, logging is the act of keeping a log of events that occur in a computer system, such as problems, errors or just information on current operations. These events may occur in the operating system or in other software. A message or log entry is recorded for each such event. These log messages can then be used to monitor and understand the operation of the system, to debug problems, or during an audit. Logging is particularly important in multi-user software, where it provides a central overview of the operation of the system.
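
As a minimal illustration in Python (the file name and messages are made up for the example), the standard library's logging module records such events with a timestamp and severity, both to the console and to a file:

    import logging

    # Send log entries both to the console and to a file for later review.
    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(name)s: %(message)s",
        handlers=[logging.StreamHandler(), logging.FileHandler("app.log")],
    )

    log = logging.getLogger("example")
    log.info("service started")
    log.warning("disk usage at %d%%", 91)  # arguments are interpolated into the message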

Cisco Security Monitoring, Analysis, and Response System (MARS) was a security monitoring tool for network devices. Together with the Cisco Security Manager (CSM) product, MARS made up the two primary components of the Cisco Security Management Suite.

Multi-factor authentication: Method of computer access control

Multi-factor authentication (MFA) is an electronic authentication method in which a user is granted access to a website or application only after successfully presenting two or more pieces of evidence to an authentication mechanism: knowledge, possession, and inherence. MFA protects user data (which may include personal identification or financial assets) from being accessed by an unauthorized third party that may have been able to discover, for example, a single password.

Security information and event management: Computer security

Security information and event management (SIEM) is a field within computer security where software products and services combine security information management (SIM) and security event management (SEM). They provide real-time analysis of security alerts generated by applications and network hardware. Vendors sell SIEM as software, as appliances, or as managed services; these products are also used to log security data and generate reports for compliance purposes. The term and the initialism SIEM were coined by Mark Nicolett and Amrit Williams of Gartner in 2005.

Mobile security: Security risk and prevention for mobile devices

Mobile security, or mobile device security, is the protection of smartphones, tablets, and laptops from threats associated with wireless computing. It has become increasingly important in mobile computing. The security of personal and business information now stored on smartphones is of particular concern.

Octopussy (software): Log analysis software

Octopussy, also known as 8Pussy, is free and open-source computer software which monitors systems by constantly analyzing the syslog data they generate and transmit to a central Octopussy server. Software like Octopussy therefore plays an important role in maintaining an information security management system within ISO/IEC 27001-compliant environments.

References

  1. "artificial ignorance: how-to guide". www.ranum.com.
  2. "Log message classification with syslog-ng [LWN.net]". lwn.net.