Trend analysis

Trend analysis is the widespread practice of collecting information and attempting to spot a pattern. In some fields of study, the term has more formally defined meanings. [1] [2] [3]

Although trend analysis is often used to predict future events, it can also be used to estimate uncertain events in the past, such as how many ancient kings probably ruled between two dates, based on data such as the average length of reign of other known kings.
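As a rough illustration of this kind of backward estimation, the following sketch divides the length of an undocumented period by the average reign length of known kings. All figures are hypothetical and chosen only to show the arithmetic:

```python
# Illustrative sketch (hypothetical figures): estimate how many kings
# probably ruled during an undocumented period from the average reign
# length of other known kings.
known_reign_lengths = [22, 17, 31, 9, 25]            # years, invented data
average_reign = sum(known_reign_lengths) / len(known_reign_lengths)

period_start, period_end = -900, -750                # gap boundaries (BCE as negative years)
gap_years = period_end - period_start

estimated_kings = round(gap_years / average_reign)
print(f"Average reign: {average_reign:.1f} years")
print(f"Estimated kings in the {gap_years}-year gap: {estimated_kings}")
```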

Project management

In project management, trend analysis is a mathematical technique that uses historical results to predict future outcomes. This is achieved by tracking variances in cost and schedule performance. In this context, it is a project management quality control tool. [4] [5]
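A minimal sketch of this idea, using hypothetical earned-value figures: cost and schedule variances are computed for each reporting period using the standard earned value conventions (CV = EV − AC, SV = EV − PV), and a simple least-squares line extrapolates the cost-variance trend. The numbers and the one-period-ahead projection are illustrative only, not a prescribed method:

```python
# Hedged sketch: track cost variance (CV = EV - AC) and schedule variance
# (SV = EV - PV) per reporting period, then extrapolate the CV trend with a
# least-squares line. All figures are hypothetical.
import numpy as np

periods       = np.array([1, 2, 3, 4, 5])
planned_value = np.array([100, 200, 300, 400, 500])  # PV: budgeted cost of work scheduled
earned_value  = np.array([ 95, 185, 270, 350, 430])  # EV: budgeted cost of work performed
actual_cost   = np.array([100, 198, 292, 383, 478])  # AC: actual cost of work performed

cost_variance     = earned_value - actual_cost       # negative = over budget
schedule_variance = earned_value - planned_value     # negative = behind schedule

# Fit a straight line to the cost variance and project it one period ahead.
slope, intercept = np.polyfit(periods, cost_variance, 1)
next_period_cv = slope * (periods[-1] + 1) + intercept
print("Cost variance by period:", cost_variance)
print(f"Projected cost variance in period {periods[-1] + 1}: {next_period_cv:.1f}")
```

Here the cost variance worsens each period, so the fitted trend warns that the overrun is likely to continue unless corrective action is taken.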

Statistics

In statistics, trend analysis often refers to techniques for extracting an underlying pattern of behavior in a time series which would otherwise be partly or nearly completely hidden by noise. If the trend can be assumed to be linear, trend analysis can be undertaken within a formal regression analysis, as described in Trend estimation. If the trend is not linear, trend testing can be done by non-parametric methods, e.g. the Mann-Kendall test, which is a version of the Kendall rank correlation coefficient. Smoothing can also be used for testing and visualization of nonlinear trends.
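As an illustration, the sketch below fits a linear trend with ordinary least squares and, separately, computes Kendall's rank correlation between time and the observations, which is the core statistic behind the Mann-Kendall trend test (the full test additionally handles ties and a variance correction). The series is synthetic and scipy is assumed to be available:

```python
# Sketch: detect a trend in a noisy time series two ways.
# 1) Parametric: ordinary least-squares slope (assumes a linear trend).
# 2) Non-parametric: Kendall's tau between time and the values,
#    the basis of the Mann-Kendall trend test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
t = np.arange(100)
series = 0.05 * t + rng.normal(scale=2.0, size=t.size)  # weak upward trend plus noise

lin = stats.linregress(t, series)
tau, p_value = stats.kendalltau(t, series)

print(f"OLS slope: {lin.slope:.3f} per step (p = {lin.pvalue:.3g})")
print(f"Kendall's tau: {tau:.3f} (p = {p_value:.3g})")
```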

Text

Trend analysis can also be applied to word usage, tracking how the frequency of words changes over time (diachronic analysis), for example to find neologisms or archaisms. It relates to diachronic linguistics, a field of linguistics which examines how languages change over time. Google provides the tool Google Trends to explore how particular terms are trending in internet searches. There are also tools which provide diachronic analysis of particular texts, comparing word usage in each period of the text (based on timestamps); see e.g. Sketch Engine diachronic analysis (trends). [6]
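A minimal sketch of this kind of diachronic frequency measurement on timestamped texts. The corpus here is a toy dictionary with invented sentences; a real study would use a large timestamped corpus such as those served by corpus managers like Sketch Engine:

```python
# Sketch: relative frequency of a target word per period in a timestamped
# toy corpus, the basic measurement behind diachronic trend analysis.
from collections import Counter

corpus_by_year = {  # hypothetical timestamped texts
    1990: "the camera was not something anyone carried around every day",
    2005: "early phone cameras appear and the word selfie starts to spread",
    2020: "the selfie is everywhere selfie sticks selfie filters and more",
}

target = "selfie"
for year, text in sorted(corpus_by_year.items()):
    tokens = text.split()
    counts = Counter(tokens)
    rel_freq = counts[target] / len(tokens)  # occurrences per token
    print(f"{year}: {rel_freq:.3f} relative frequency of '{target}'")
```

A rising relative frequency across periods suggests a neologism; a falling one suggests an archaism.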

Notes

  1. In finance, trend analysis tries to identify a trend, such as a bull market run, and ride that trend until data suggests a reversal (e.g. bull to bear market); it is growing in popularity because moving with trends, rather than against them, is expected to yield a profit for an investor. WebFinance, Inc., Investor-TA, trend analysis definition: a comparative analysis of a company's financial ratios over time.
  2. Iowa State University Office of Social and Economic Trend Analysis.
  3. John Immerwahr, Public Agenda, Public Attitudes on Higher Education – A Trend Analysis, 1993 to 2003, February 2004.
  4. Project Management Body of Knowledge (PMBOK), PMI, 1997, page 334.
  5. PMBOK Ready Reckoner, pages 7 and 8.
  6. Kilgarriff, Adam; Herman, Ondřej; Bušta, Jan; Kovář, Vojtěch; Jakubíček, Miloš (July 2015). "DIACRAN: a framework for diachronic analysis" (PDF). Corpus Linguistics (CL2015). Corpus Linguistics 2015. United Kingdom: Lancaster University.

Related Research Articles

Earned value management (EVM), earned value project management, or earned value performance management (EVPM) is a project management technique for measuring project performance and progress in an objective manner.

Fundamental analysis, in accounting and finance, is the analysis of a business's financial statements, health, competitors and markets. It also considers the overall state of the economy and factors including interest rates, production, earnings, employment, GDP, housing, manufacturing and management. There are two basic approaches that can be used: bottom-up analysis and top-down analysis. These terms are used to distinguish such analysis from other types of investment analysis, such as quantitative analysis and technical analysis.

Corpus linguistics is the study of a language as that language is expressed in its text corpus, its body of "real world" text. Corpus linguistics proposes that a reliable analysis of a language is more feasible with corpora collected in the field—the natural context ("realia") of that language—with minimal experimental interference. The large collections of text allow linguists to run quantitative analyses of linguistic concepts that are otherwise harder to quantify.

In finance, technical analysis is a methodology for analysing and forecasting the direction of prices through the study of past market data, primarily price and volume. As a type of active management, it stands in contradiction to much of modern portfolio theory. The efficacy of technical analysis is disputed by the efficient-market hypothesis, which states that stock market prices are essentially unpredictable, and research on whether technical analysis offers any benefit has produced mixed results. It is distinguished from fundamental analysis, which considers a company's financial statements, health, and the overall state of the market and economy.

The Project Management Body of Knowledge (PMBOK) is a set of standard terminology and guidelines for project management. The body of knowledge evolves over time and is presented in A Guide to the Project Management Body of Knowledge, a book whose seventh edition was released in 2021. This document results from work overseen by the Project Management Institute (PMI), which offers the CAPM and PMP certifications.

A prediction or forecast is a statement about a future event or about future data. Predictions are often, but not always, based upon experience or knowledge of forecasters. There is no universal agreement about the exact difference between "prediction" and "estimation"; different authors and disciplines ascribe different connotations.

Forecasting is the process of making predictions based on past and present data. Later these can be compared (resolved) against what happens. For example, a company might estimate its revenue in the next year, then compare it against the actual results, creating a variance analysis. Prediction is a similar but more general term. Forecasting might refer to specific formal statistical methods employing time series, cross-sectional or longitudinal data, or alternatively to less formal judgmental methods or the process of prediction and resolution itself. Usage can vary between areas of application: for example, in hydrology the terms "forecast" and "forecasting" are sometimes reserved for estimates of values at certain specific future times, while the term "prediction" is used for more general estimates, such as the number of times floods will occur over a long period.
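A small sketch of the forecast-then-resolve loop described above, with invented revenue figures: the forecast is compared against the actual outcome to produce a variance:

```python
# Sketch: compare a revenue forecast against the actual result
# (variance analysis). Figures are hypothetical.
forecast_revenue = 1_200_000  # forecast for next year
actual_revenue   = 1_050_000  # what actually happened

variance = actual_revenue - forecast_revenue
variance_pct = variance / forecast_revenue * 100
print(f"Variance: {variance:+,} ({variance_pct:+.1f}% vs forecast)")
```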

In mathematics, a time series is a series of data points indexed in time order. Most commonly, a time series is a sequence taken at successive equally spaced points in time. Thus it is a sequence of discrete-time data. Examples of time series are heights of ocean tides, counts of sunspots, and the daily closing value of the Dow Jones Industrial Average.

In linguistics, productivity is the degree to which speakers of a language use a particular grammatical process, especially in word formation. It compares grammatical processes that are in frequent use to less frequently used ones that tend towards lexicalization. Generally the test of productivity concerns identifying which grammatical forms would be used in the coining of new words: these will tend to only be converted to other forms using productive processes.

All telecommunications service providers perform forecasting calculations to assist them in planning their networks. Accurate forecasting helps operators to make key investment decisions relating to product development and introduction, advertising, pricing etc., well in advance of product launch, which helps to ensure that the company will make a profit on a new venture and that capital is invested wisely.

In management literature, gap analysis involves the comparison of actual performance with potential or desired performance. If an organization does not make the best use of current resources, or forgoes investment in productive physical capital or technology, it may produce or perform below an idealized potential. This concept is similar to an economy's production being below the production possibilities frontier.

In linguistics, a treebank is a parsed text corpus that annotates syntactic or semantic sentence structure. The construction of parsed corpora in the early 1990s revolutionized computational linguistics, which benefitted from large-scale empirical data.

Predictive analytics is a form of business analytics applying machine learning to generate a predictive model for certain business applications. As such, it encompasses a variety of statistical techniques from predictive modeling and machine learning that analyze current and historical facts to make predictions about future or otherwise unknown events. It represents a major subset of machine learning applications; in some contexts, it is synonymous with machine learning.

The British National Corpus (BNC) is a 100-million-word text corpus of samples of written and spoken English from a wide range of sources. The corpus covers British English of the late 20th century from a wide variety of genres, with the intention that it be a representative sample of spoken and written British English of that time. It is used in corpus linguistics for analysis of corpora.

Corpus-assisted discourse studies is related historically and methodologically to the discipline of corpus linguistics. The principal endeavor of corpus-assisted discourse studies is the investigation and comparison of features of particular discourse types, integrating into the analysis the techniques and tools developed within corpus linguistics. These include the compilation of specialised corpora and analyses of word and word-cluster frequency lists, comparative keyword lists and, above all, concordances.

A hydrologic model is a simplification of a real-world system that aids in understanding, predicting, and managing water resources. Both the flow and quality of water are commonly studied using hydrologic models.

Demand forecasting refers to the process of predicting the quantity of goods and services that will be demanded by consumers at a future point in time. More specifically, the methods of demand forecasting entail using predictive analytics to estimate customer demand in consideration of key economic conditions. This is an important tool in optimizing business profitability through efficient supply chain management. Demand forecasting methods are divided into two major categories, qualitative and quantitative methods. Qualitative methods are based on expert opinion and information gathered from the field; they are mostly used when there is minimal data available for analysis, such as when a business or product has recently been introduced to the market. Quantitative methods, in contrast, use available data and analytical tools to produce predictions. Demand forecasting may be used in resource allocation, inventory management, assessing future capacity requirements, or making decisions on whether to enter a new market.

Sketch Engine is a corpus manager and text analysis software developed by Lexical Computing CZ s.r.o. since 2003. Its purpose is to enable people studying language behaviour to search large text collections according to complex and linguistically motivated queries. Sketch Engine gained its name after one of the key features, word sketches: one-page, automatic, corpus-derived summaries of a word's grammatical and collocational behaviour. Currently, it supports and provides corpora in 90+ languages.

Native-language identification (NLI) is the task of determining an author's native language (L1) based only on their writings in a second language (L2). NLI works through identifying language-usage patterns that are common to specific L1 groups and then applying this knowledge to predict the native language of previously unseen texts. This is motivated in part by applications in second-language acquisition, language teaching and forensic linguistics, amongst others.

Usage-based linguistics is a linguistics approach within a broader functional/cognitive framework that emerged in the late 1980s and assumes a profound relation between linguistic structure and usage. It challenges the dominant focus, in 20th century linguistics, on considering language as an isolated system removed from its use in human interaction and human cognition. Rather, usage-based models posit that linguistic information is expressed via context-sensitive mental processing and mental representations, which have the cognitive ability to succinctly account for the complexity of actual language use at all levels. Broadly speaking, a usage-based model of language accounts for language acquisition and processing, synchronic and diachronic patterns, and both low-level and high-level structure in language, by looking at actual language use.