Performance Regression Analysis
A comprehensive guide to detecting, analyzing, and preventing performance regressions in Go applications. It covers regression detection algorithms, statistical methods, automated monitoring, and remediation strategies for maintaining consistent application performance.
Table of Contents
- Introduction
- Regression Detection Framework
- Statistical Methods
- Automated Monitoring
- Trend Analysis
- Alert Systems
- Root Cause Analysis
- Remediation Strategies
- Prevention Techniques
- Best Practices
Introduction
Performance regression analysis is critical for maintaining application quality over time. This guide provides comprehensive strategies for detecting performance degradations early, analyzing their causes, and implementing effective prevention and remediation measures.
Regression Detection Framework
package main
import (
"context"
"fmt"
"math"
"sort"
"sync"
"time"
)
// RegressionAnalyzer detects and analyzes performance regressions
type RegressionAnalyzer struct {
detectors map[string]RegressionDetector
baseline *PerformanceBaseline
history *PerformanceHistory
alertManager *RegressionAlertManager
rootCause *RootCauseAnalyzer
config RegressionConfig
cache *RegressionCache
metrics *RegressionMetrics
mu sync.RWMutex
}
// RegressionConfig contains regression analysis configuration
type RegressionConfig struct {
DetectionMethods []DetectionMethod
SensitivityLevel SensitivityLevel
BaselineWindow time.Duration
ComparisonWindow time.Duration
MinDataPoints int
ConfidenceLevel float64
SignificanceLevel float64
ChangeThreshold float64
EnableAutoBaseline bool
EnableTrendAnalysis bool
EnableAlerts bool
EnableRootCause bool
MaxHistory int
UpdateInterval time.Duration
}
// DetectionMethod defines regression detection methods
type DetectionMethod int
const (
StatisticalDetection DetectionMethod = iota
ThresholdDetection
TrendDetection
ChangePointDetection
MLDetection
EnsembleDetection
)
// SensitivityLevel defines detection sensitivity
type SensitivityLevel int
const (
LowSensitivity SensitivityLevel = iota
MediumSensitivity
HighSensitivity
UltraHighSensitivity
)
// RegressionDetector performs regression detection
type RegressionDetector interface {
DetectRegression(current *PerformanceData, baseline *PerformanceBaseline) (*RegressionResult, error)
GetType() DetectionMethod
GetSensitivity() SensitivityLevel
GetAccuracy() DetectionAccuracy
UpdateBaseline(data *PerformanceData) error
}
// PerformanceData represents performance measurement data
type PerformanceData struct {
ID string
Timestamp time.Time
Metrics map[string]MetricValue
Metadata DataMetadata
Environment Environment
Build BuildInfo
Commit CommitInfo
Quality DataQuality
}
// MetricValue represents a performance metric value
type MetricValue struct {
Value float64
Unit string
Type MetricType
Samples []float64
Statistics DescriptiveStats
Confidence ConfidenceInterval
}
// MetricType defines performance metric types
type MetricType int
const (
LatencyMetric MetricType = iota
ThroughputMetric
MemoryMetric
CPUMetric
DiskIOMetric
NetworkIOMetric
CustomMetric
)
// DataMetadata contains metadata about performance data
type DataMetadata struct {
Source string
Test string
Configuration map[string]interface{}
Duration time.Duration
Iterations int
Parallelism int
Tags map[string]string
}
// Environment describes the execution environment
type Environment struct {
OS string
Architecture string
GoVersion string
CPUModel string
CPUCores int
Memory int64
Disk string
Network string
Load SystemLoad
Temperature float64
}
// SystemLoad represents system load information
type SystemLoad struct {
CPU float64
Memory float64
Disk float64
Network float64
Average LoadAverage
}
// LoadAverage represents load averages
type LoadAverage struct {
OneMin float64
FiveMin float64
FifteenMin float64
}
// BuildInfo contains build information
type BuildInfo struct {
Version string
Commit string
Branch string
Tag string
BuildTime time.Time
BuildFlags []string
}
// CommitInfo contains commit information
type CommitInfo struct {
Hash string
Author string
Message string
Timestamp time.Time
Files []string
Stats CommitStats
}
// CommitStats contains commit statistics
type CommitStats struct {
Additions int
Deletions int
Files int
Changed []string
}
// DataQuality represents data quality metrics
type DataQuality struct {
Completeness float64
Stability float64
Reliability float64
Outliers int
Noise float64
OverallScore float64
}
// DescriptiveStats contains descriptive statistics
type DescriptiveStats struct {
Count int
Mean float64
Median float64
StdDev float64
Min float64
Max float64
Percentiles map[int]float64
}
// ConfidenceInterval represents a confidence interval
type ConfidenceInterval struct {
Lower float64
Upper float64
Confidence float64
}
// PerformanceBaseline represents performance baselines
type PerformanceBaseline struct {
ID string
Metrics map[string]BaselineMetric
TimeRange TimeRange
Environment Environment
Build BuildInfo
Quality BaselineQuality
CreatedAt time.Time
UpdatedAt time.Time
Version int
}
// BaselineMetric contains baseline metric information
type BaselineMetric struct {
Name string
Statistics DescriptiveStats
Distribution Distribution
Percentiles map[int]float64
Thresholds Thresholds
Trend TrendInfo
Seasonality SeasonalInfo
}
// Distribution represents statistical distribution
type Distribution struct {
Type DistributionType
Parameters map[string]float64
GoodnessOfFit float64
}
// DistributionType defines distribution types
type DistributionType int
const (
NormalDistribution DistributionType = iota
LogNormalDistribution
ExponentialDistribution
GammaDistribution
WeibullDistribution
)
// Thresholds defines performance thresholds
type Thresholds struct {
Warning float64
Critical float64
Regression float64
Improvement float64
Adaptive bool
Method ThresholdMethod
}
// ThresholdMethod defines threshold calculation methods
type ThresholdMethod int
const (
StaticThreshold ThresholdMethod = iota
PercentileThreshold
StatisticalThreshold
AdaptiveThreshold
MLThreshold
)
// TrendInfo contains trend information
type TrendInfo struct {
Direction TrendDirection
Slope float64
Strength float64
Confidence float64
StartTime time.Time
EndTime time.Time
}
// TrendDirection defines trend directions
type TrendDirection int
const (
Stable TrendDirection = iota
Improving
Degrading
Volatile
)
// SeasonalInfo contains seasonality information
type SeasonalInfo struct {
Present bool
Period time.Duration
Amplitude float64
Phase float64
Patterns []SeasonalPattern
}
// SeasonalPattern represents seasonal patterns
type SeasonalPattern struct {
Type PatternType
Period time.Duration
Strength float64
Confidence float64
}
// PatternType defines pattern types
type PatternType int
const (
DailyPattern PatternType = iota
WeeklyPattern
MonthlyPattern
CustomPattern
)
// TimeRange represents a time range
type TimeRange struct {
Start time.Time
End time.Time
}
// BaselineQuality represents baseline quality metrics
type BaselineQuality struct {
DataPoints int
Completeness float64
Stability float64
Freshness float64
Confidence float64
OverallScore float64
}
// RegressionResult contains regression detection results
type RegressionResult struct {
ID string
Detected bool
Severity RegressionSeverity
Confidence float64
AffectedMetrics []AffectedMetric
ChangePoints []ChangePoint
Analysis RegressionAnalysis
RootCause *RootCauseInfo
Recommendations []Recommendation
Timestamp time.Time
}
// RegressionSeverity defines regression severity levels
type RegressionSeverity int
const (
MinorRegression RegressionSeverity = iota
ModerateRegression
MajorRegression
CriticalRegression
)
// AffectedMetric represents an affected performance metric
type AffectedMetric struct {
Name string
Type MetricType
BaselineValue float64
CurrentValue float64
Change Change
Significance StatisticalSignificance
Impact ImpactAssessment
}
// Change represents performance change information
type Change struct {
Absolute float64
Relative float64
Direction ChangeDirection
Magnitude ChangeMagnitude
}
// ChangeDirection defines change directions
type ChangeDirection int
const (
NoChange ChangeDirection = iota
Improvement
Degradation
)
// ChangeMagnitude defines change magnitude levels
type ChangeMagnitude int
const (
Negligible ChangeMagnitude = iota
Small
Medium
Large
Extreme
)
// StatisticalSignificance contains statistical significance information
type StatisticalSignificance struct {
PValue float64
Significant bool
TestStatistic float64
TestType StatisticalTest
EffectSize EffectSize
}
// StatisticalTest defines statistical tests
type StatisticalTest int
const (
TTest StatisticalTest = iota
MannWhitneyTest
WilcoxonTest
KSTest
PermutationTest
)
// EffectSize represents effect size measurements
type EffectSize struct {
CohensD float64
GlassD float64
R2 float64
Interpretation EffectInterpretation
}
// EffectInterpretation defines effect size interpretations
type EffectInterpretation int
const (
NegligibleEffect EffectInterpretation = iota
SmallEffect
MediumEffect
LargeEffect
VeryLargeEffect
)
// ImpactAssessment represents impact assessment
type ImpactAssessment struct {
BusinessImpact BusinessImpact
TechnicalImpact TechnicalImpact
UserImpact UserImpact
OverallSeverity ImpactSeverity
}
// BusinessImpact represents business impact
type BusinessImpact struct {
Revenue float64
Cost float64
SLA SLAImpact
Reputation ReputationImpact
Competitive CompetitiveImpact
}
// SLAImpact represents SLA impact
type SLAImpact struct {
Violated bool
Metrics map[string]float64
Penalties float64
RiskLevel RiskLevel
}
// RiskLevel defines risk levels
type RiskLevel int
const (
LowRisk RiskLevel = iota
MediumRisk
HighRisk
CriticalRisk
)
// ReputationImpact represents reputation impact
type ReputationImpact struct {
Score float64
Sentiment float64
Mentions int
Escalations int
}
// CompetitiveImpact represents competitive impact
type CompetitiveImpact struct {
Advantage float64
MarketShare float64
Position int
Differentiation float64
}
// TechnicalImpact represents technical impact
type TechnicalImpact struct {
SystemStability float64
ResourceUsage ResourceImpact
Scalability ScalabilityImpact
Maintainability float64
}
// ResourceImpact represents resource impact
type ResourceImpact struct {
CPU float64
Memory float64
Disk float64
Network float64
Efficiency float64
}
// ScalabilityImpact represents scalability impact
type ScalabilityImpact struct {
Throughput float64
Latency float64
Concurrency float64
Capacity float64
}
// UserImpact represents user impact
type UserImpact struct {
Experience UserExperience
Satisfaction float64
Retention float64
Engagement float64
}
// UserExperience represents user experience metrics
type UserExperience struct {
ResponseTime float64
Availability float64
Reliability float64
Usability float64
}
// ImpactSeverity defines impact severity levels
type ImpactSeverity int
const (
LowImpact ImpactSeverity = iota
MediumImpact
HighImpact
CriticalImpact
)
// ChangePoint represents a detected change point
type ChangePoint struct {
Timestamp time.Time
Metric string
Type ChangeType
Magnitude float64
Confidence float64
Method ChangePointMethod
Context ChangeContext
}
// ChangeType defines change point types
type ChangeType int
const (
LevelChange ChangeType = iota
TrendChange
VarianceChange
DistributionChange
)
// ChangePointMethod defines change point detection methods
type ChangePointMethod int
const (
CUSUM ChangePointMethod = iota
PELT
BinSeg
EdDivisive
KernelCPD
)
// ChangeContext provides context for change points
type ChangeContext struct {
Build BuildInfo
Commit CommitInfo
Deploy DeployInfo
Config ConfigChange
External ExternalEvent
}
// DeployInfo contains deployment information
type DeployInfo struct {
ID string
Version string
Timestamp time.Time
Changes []string
Rollback bool
}
// ConfigChange represents configuration changes
type ConfigChange struct {
Component string
Parameter string
OldValue interface{}
NewValue interface{}
Timestamp time.Time
}
// ExternalEvent represents external events
type ExternalEvent struct {
Type EventType
Description string
Timestamp time.Time
Source string
Impact float64
}
// EventType defines external event types
type EventType int
const (
InfrastructureEvent EventType = iota
NetworkEvent
DatabaseEvent
ServiceEvent
EnvironmentEvent
)
// RegressionAnalysis contains detailed regression analysis
type RegressionAnalysis struct {
Method DetectionMethod
Algorithm string
Parameters map[string]interface{}
Timeline AnalysisTimeline
Comparison ComparisonAnalysis
Trend TrendAnalysis
Correlation CorrelationAnalysis
Confidence ConfidenceAnalysis
}
// AnalysisTimeline tracks analysis timeline
type AnalysisTimeline struct {
DetectionTime time.Time
AnalysisTime time.Time
Duration time.Duration
DataPoints int
WindowSize time.Duration
}
// ComparisonAnalysis contains comparison analysis results
type ComparisonAnalysis struct {
BaselinePeriod TimeRange
ComparisonPeriod TimeRange
SampleSizes SampleSizes
Statistical StatisticalComparison
Practical PracticalComparison
}
// SampleSizes contains sample size information
type SampleSizes struct {
Baseline int
Comparison int
Adequate bool
Power float64
}
// StatisticalComparison contains statistical comparison results
type StatisticalComparison struct {
Test StatisticalTest
PValue float64
Significant bool
EffectSize EffectSize
Confidence ConfidenceInterval
}
// PracticalComparison contains practical comparison results
type PracticalComparison struct {
Threshold float64
Exceeded bool
Magnitude ChangeMagnitude
Business BusinessSignificance
}
// BusinessSignificance represents business significance
type BusinessSignificance struct {
Significant bool
Impact float64
Cost float64
Priority BusinessPriority
}
// BusinessPriority defines business priority levels
type BusinessPriority int
const (
LowBusinessPriority BusinessPriority = iota
MediumBusinessPriority
HighBusinessPriority
CriticalBusinessPriority
)
// TrendAnalysis contains trend analysis results
type TrendAnalysis struct {
LongTerm TrendInfo
ShortTerm TrendInfo
Forecast ForecastInfo
Seasonal SeasonalInfo
}
// ForecastInfo contains forecasting information
type ForecastInfo struct {
Method ForecastMethod
Horizon time.Duration
Values []float64
Confidence []ConfidenceInterval
Accuracy ForecastAccuracy
}
// ForecastMethod defines forecasting methods
type ForecastMethod int
const (
LinearTrend ForecastMethod = iota
ExponentialSmoothing
ARIMA
Prophet
MLForecast
)
// ForecastAccuracy contains forecast accuracy metrics
type ForecastAccuracy struct {
MAE float64
MAPE float64
RMSE float64
SMAPE float64
}
// CorrelationAnalysis contains correlation analysis results
type CorrelationAnalysis struct {
Metrics []MetricCorrelation
External []ExternalCorrelation
Lag []LagCorrelation
Causality []CausalityAnalysis
}
// MetricCorrelation represents correlation between metrics
type MetricCorrelation struct {
Metric1 string
Metric2 string
Coefficient float64
Significance float64
Type CorrelationType
}
// CorrelationType defines correlation types
type CorrelationType int
const (
PearsonCorrelation CorrelationType = iota
SpearmanCorrelation
KendallCorrelation
PartialCorrelation
)
// ExternalCorrelation represents correlation with external factors
type ExternalCorrelation struct {
Factor string
Coefficient float64
Significance float64
Lag time.Duration
}
// LagCorrelation represents lagged correlations
type LagCorrelation struct {
Metric string
Lag time.Duration
Coefficient float64
Significance float64
}
// CausalityAnalysis represents causality analysis results
type CausalityAnalysis struct {
Cause string
Effect string
Strength float64
Confidence float64
Method CausalityMethod
}
// CausalityMethod defines causality analysis methods
type CausalityMethod int
const (
GrangerCausality CausalityMethod = iota
TransferEntropy
PearsonCausality
CCM
)
// ConfidenceAnalysis contains confidence analysis results
type ConfidenceAnalysis struct {
Overall float64
Statistical float64
Practical float64
Temporal float64
Contextual float64
}
// RootCauseInfo contains root cause analysis information
type RootCauseInfo struct {
Identified bool
Causes []PotentialCause
Primary *PotentialCause
Investigation InvestigationInfo
Evidence []Evidence
}
// PotentialCause represents a potential root cause
type PotentialCause struct {
Type CauseType
Description string
Probability float64
Impact float64
Evidence []Evidence
Timeline CauseTimeline
}
// CauseType defines root cause types
type CauseType int
const (
CodeChange CauseType = iota
ConfigurationChange
InfrastructureChange
DataChange
ExternalDependency
ResourceContention
EnvironmentalFactor
)
// CauseTimeline tracks cause timeline
type CauseTimeline struct {
Introduced time.Time
Detected time.Time
Confirmed time.Time
Resolved *time.Time
}
// InvestigationInfo contains investigation information
type InvestigationInfo struct {
Status InvestigationStatus
Assignee string
StartedAt time.Time
UpdatedAt time.Time
Notes []InvestigationNote
Actions []InvestigationAction
}
// InvestigationStatus defines investigation status
type InvestigationStatus int
const (
NotStarted InvestigationStatus = iota
InProgress
Completed
Blocked
Cancelled
)
// InvestigationNote represents investigation notes
type InvestigationNote struct {
Author string
Content string
Timestamp time.Time
Tags []string
}
// InvestigationAction represents investigation actions
type InvestigationAction struct {
Type ActionType
Description string
Assignee string
Status ActionStatus
DueDate time.Time
CompletedAt *time.Time
}
// ActionType defines investigation action types
type ActionType int
const (
DataCollection ActionType = iota
Analysis
Testing
Verification
Remediation
)
// ActionStatus defines action status
type ActionStatus int
const (
ActionPending ActionStatus = iota
ActionInProgress
ActionCompleted
ActionFailed
ActionCancelled
)
// Evidence represents evidence for root causes
type Evidence struct {
Type EvidenceType
Source string
Description string
Strength float64
Timestamp time.Time
Data interface{}
}
// EvidenceType defines evidence types
type EvidenceType int
const (
MetricEvidence EvidenceType = iota
LogEvidence
TraceEvidence
ProfileEvidence
CodeEvidence
ConfigEvidence
)
// Recommendation represents analysis recommendations
type Recommendation struct {
Type RecommendationType
Priority RecommendationPriority
Description string
Actions []RecommendedAction
Impact ExpectedImpact
Effort EffortEstimate
}
// RecommendationType defines recommendation types
type RecommendationType int
const (
ImmediateAction RecommendationType = iota
Investigation
Monitoring
Prevention
Optimization
)
// RecommendationPriority defines recommendation priority
type RecommendationPriority int
const (
LowPriority RecommendationPriority = iota
MediumPriority
HighPriority
UrgentPriority
)
// RecommendedAction represents recommended actions
type RecommendedAction struct {
Description string
Type ActionType
Urgency ActionUrgency
Owner string
Deadline time.Time
}
// ActionUrgency defines action urgency levels
type ActionUrgency int
const (
LowUrgency ActionUrgency = iota
MediumUrgency
HighUrgency
CriticalUrgency
)
// ExpectedImpact represents expected impact of recommendations
type ExpectedImpact struct {
Performance float64
Reliability float64
Scalability float64
Maintainability float64
Cost float64
}
// EffortEstimate represents effort estimation
type EffortEstimate struct {
Hours float64
Complexity ComplexityLevel
Resources []RequiredResource
Timeline time.Duration
Risk RiskAssessment
}
// ComplexityLevel defines complexity levels
type ComplexityLevel int
const (
LowComplexity ComplexityLevel = iota
MediumComplexity
HighComplexity
VeryHighComplexity
)
// RequiredResource represents required resources
type RequiredResource struct {
Type ResourceType
Amount float64
Skills []string
Availability float64
}
// ResourceType defines resource types
type ResourceType int
const (
DeveloperResource ResourceType = iota
DevOpsResource
QAResource
ArchitectResource
InfrastructureResource
)
// RiskAssessment represents risk assessment
type RiskAssessment struct {
Level RiskLevel
Factors []RiskFactor
Mitigation []MitigationStrategy
Probability float64
Impact float64
}
// RiskFactor represents risk factors
type RiskFactor struct {
Description string
Probability float64
Impact float64
Mitigation string
}
// MitigationStrategy represents mitigation strategies
type MitigationStrategy struct {
Description string
Effectiveness float64
Cost float64
Timeline time.Duration
}
// PerformanceHistory manages performance history data
type PerformanceHistory struct {
data map[string][]*PerformanceData
indices map[string]*TimeIndex
config HistoryConfig
storage HistoryStorage
cache *HistoryCache
mu sync.RWMutex
}
// HistoryConfig contains history configuration
type HistoryConfig struct {
MaxEntries int
RetentionPeriod time.Duration
CompressionEnabled bool
IndexingEnabled bool
CacheSize int
CacheTTL time.Duration
}
// TimeIndex provides time-based indexing
type TimeIndex struct {
entries []TimeEntry
sorted bool
}
// TimeEntry represents time index entries
type TimeEntry struct {
Timestamp time.Time
Index int
ID string
}
// HistoryStorage manages persistent history storage
type HistoryStorage interface {
Store(data *PerformanceData) error
Retrieve(query HistoryQuery) ([]*PerformanceData, error)
Delete(criteria DeletionCriteria) error
Compact() error
}
// HistoryQuery represents history queries
type HistoryQuery struct {
TimeRange TimeRange
Metrics []string
Filters map[string]interface{}
Limit int
Aggregation AggregationSpec
}
// AggregationSpec specifies data aggregation
type AggregationSpec struct {
Function AggregationFunction
Interval time.Duration
GroupBy []string
}
// AggregationFunction defines aggregation functions
type AggregationFunction int
const (
MeanAggregation AggregationFunction = iota
MedianAggregation
MaxAggregation
MinAggregation
SumAggregation
CountAggregation
)
// DeletionCriteria specifies deletion criteria
type DeletionCriteria struct {
OlderThan time.Time
Filters map[string]interface{}
KeepLast int
}
// DetectionAccuracy contains detection accuracy metrics
type DetectionAccuracy struct {
Precision float64
Recall float64
F1Score float64
Specificity float64
Accuracy float64
FalsePositiveRate float64
FalseNegativeRate float64
}
// NewRegressionAnalyzer creates a new regression analyzer
func NewRegressionAnalyzer(config RegressionConfig) *RegressionAnalyzer {
analyzer := &RegressionAnalyzer{
detectors: make(map[string]RegressionDetector),
baseline: NewPerformanceBaseline(),
history: NewPerformanceHistory(),
alertManager: NewRegressionAlertManager(),
rootCause: NewRootCauseAnalyzer(),
config: config,
cache: NewRegressionCache(),
metrics: &RegressionMetrics{},
}
// Initialize detectors based on configuration
for _, method := range config.DetectionMethods {
detector := createDetector(method, config)
if detector != nil {
analyzer.detectors[detector.GetType().String()] = detector
}
}
return analyzer
}
// AnalyzeRegression performs comprehensive regression analysis
func (ra *RegressionAnalyzer) AnalyzeRegression(ctx context.Context, data *PerformanceData) (*RegressionResult, error) {
ra.mu.Lock()
defer ra.mu.Unlock()
// Validate input data
if err := ra.validateData(data); err != nil {
return nil, fmt.Errorf("data validation failed: %w", err)
}
// Check cache first
if cached := ra.cache.Get(data.ID); cached != nil {
return cached, nil
}
// Get or create baseline
baseline := ra.getOrCreateBaseline(data)
if baseline == nil {
return nil, fmt.Errorf("failed to establish baseline")
}
// Perform regression detection using multiple methods
results := make([]*RegressionResult, 0, len(ra.detectors))
for _, detector := range ra.detectors {
result, err := detector.DetectRegression(data, baseline)
if err != nil {
// Log error but continue with other detectors
continue
}
if result.Detected {
results = append(results, result)
}
}
// Combine results using ensemble method
finalResult := ra.combineResults(results, data)
// Perform root cause analysis if regression detected
if finalResult.Detected && ra.config.EnableRootCause {
rootCause, err := ra.rootCause.Analyze(ctx, data, baseline)
if err == nil {
finalResult.RootCause = rootCause
}
}
// Generate recommendations
finalResult.Recommendations = ra.generateRecommendations(finalResult)
// Store in history
ra.history.Add(data)
// Cache result
ra.cache.Set(data.ID, finalResult)
// Send alerts if enabled
if ra.config.EnableAlerts && finalResult.Detected {
ra.alertManager.SendAlert(finalResult)
}
// Update metrics
ra.updateMetrics(finalResult)
return finalResult, nil
}
// CompareWithBaseline compares current performance with baseline
func (ra *RegressionAnalyzer) CompareWithBaseline(current *PerformanceData, baseline *PerformanceBaseline) (*ComparisonResult, error) {
comparison := &ComparisonResult{
Current: current,
Baseline: baseline,
Timestamp: time.Now(),
}
// Compare each metric
for metricName, currentMetric := range current.Metrics {
if baselineMetric, exists := baseline.Metrics[metricName]; exists {
metricComparison := ra.compareMetric(currentMetric, baselineMetric)
comparison.Metrics = append(comparison.Metrics, metricComparison)
}
}
// Calculate overall comparison
comparison.Overall = ra.calculateOverallComparison(comparison.Metrics)
return comparison, nil
}
// UpdateBaseline updates the performance baseline
func (ra *RegressionAnalyzer) UpdateBaseline(data *PerformanceData) error {
ra.mu.Lock()
defer ra.mu.Unlock()
if ra.config.EnableAutoBaseline {
return ra.baseline.Update(data)
}
return nil
}
// GetPerformanceTrends analyzes performance trends
func (ra *RegressionAnalyzer) GetPerformanceTrends(timeRange TimeRange) (*TrendAnalysisResult, error) {
// Retrieve historical data
query := HistoryQuery{
TimeRange: timeRange,
Limit: 1000,
}
historicalData, err := ra.history.Query(query)
if err != nil {
return nil, fmt.Errorf("failed to retrieve historical data: %w", err)
}
if len(historicalData) < ra.config.MinDataPoints {
return nil, fmt.Errorf("insufficient data points for trend analysis")
}
// Perform trend analysis
trendResult := &TrendAnalysisResult{
TimeRange: timeRange,
DataPoints: len(historicalData),
Trends: make(map[string]TrendInfo),
Timestamp: time.Now(),
}
// Analyze trends for each metric
for metricName := range historicalData[0].Metrics {
trend := ra.analyzeTrendForMetric(metricName, historicalData)
trendResult.Trends[metricName] = trend
}
return trendResult, nil
}
// validateData validates performance data
func (ra *RegressionAnalyzer) validateData(data *PerformanceData) error {
if data == nil {
return fmt.Errorf("data cannot be nil")
}
if data.ID == "" {
return fmt.Errorf("data ID is required")
}
if len(data.Metrics) == 0 {
return fmt.Errorf("at least one metric is required")
}
if data.Timestamp.IsZero() {
return fmt.Errorf("timestamp is required")
}
// Validate each metric
for name, metric := range data.Metrics {
if err := ra.validateMetric(name, metric); err != nil {
return fmt.Errorf("invalid metric %s: %w", name, err)
}
}
return nil
}
// validateMetric validates a performance metric
func (ra *RegressionAnalyzer) validateMetric(name string, metric MetricValue) error {
if math.IsNaN(metric.Value) || math.IsInf(metric.Value, 0) {
return fmt.Errorf("invalid metric value")
}
if len(metric.Samples) > 0 {
for i, sample := range metric.Samples {
if math.IsNaN(sample) || math.IsInf(sample, 0) {
return fmt.Errorf("invalid sample at index %d", i)
}
}
}
return nil
}
// Helper methods and types
type RegressionMetrics struct{}
type RegressionCache struct{}
type RegressionAlertManager struct{}
type RootCauseAnalyzer struct{}
type HistoryCache struct{}
type ComparisonResult struct {
Current *PerformanceData
Baseline *PerformanceBaseline
Metrics []MetricComparison
Overall OverallComparison
Timestamp time.Time
}
type MetricComparison struct {
Name string
Current float64
Baseline float64
Change Change
Significance StatisticalSignificance
}
type OverallComparison struct {
Score float64
Regression bool
Severity RegressionSeverity
Summary string
}
type TrendAnalysisResult struct {
TimeRange TimeRange
DataPoints int
Trends map[string]TrendInfo
Timestamp time.Time
}
// Constructor functions
func NewPerformanceBaseline() *PerformanceBaseline { return &PerformanceBaseline{} }
func NewPerformanceHistory() *PerformanceHistory { return &PerformanceHistory{} }
func NewRegressionAlertManager() *RegressionAlertManager { return &RegressionAlertManager{} }
func NewRootCauseAnalyzer() *RootCauseAnalyzer { return &RootCauseAnalyzer{} }
func NewRegressionCache() *RegressionCache { return &RegressionCache{} }
// Method implementations (placeholder stubs; the full logic is omitted for brevity)
func createDetector(method DetectionMethod, config RegressionConfig) RegressionDetector { return nil }
func (ra *RegressionAnalyzer) getOrCreateBaseline(data *PerformanceData) *PerformanceBaseline { return ra.baseline }
func (ra *RegressionAnalyzer) combineResults(results []*RegressionResult, data *PerformanceData) *RegressionResult { return &RegressionResult{ID: data.ID, Detected: len(results) > 0, Timestamp: time.Now()} }
func (ra *RegressionAnalyzer) generateRecommendations(result *RegressionResult) []Recommendation { return nil }
func (ra *RegressionAnalyzer) updateMetrics(result *RegressionResult) {}
func (ra *RegressionAnalyzer) compareMetric(current MetricValue, baseline BaselineMetric) MetricComparison { return MetricComparison{} }
func (ra *RegressionAnalyzer) calculateOverallComparison(metrics []MetricComparison) OverallComparison { return OverallComparison{} }
func (ra *RegressionAnalyzer) analyzeTrendForMetric(metricName string, data []*PerformanceData) TrendInfo { return TrendInfo{} }
func (pb *PerformanceBaseline) Update(data *PerformanceData) error { return nil }
func (ph *PerformanceHistory) Add(data *PerformanceData) {}
func (ph *PerformanceHistory) Query(query HistoryQuery) ([]*PerformanceData, error) { return nil, nil }
func (rc *RegressionCache) Get(key string) *RegressionResult { return nil }
func (rc *RegressionCache) Set(key string, result *RegressionResult) {}
func (ram *RegressionAlertManager) SendAlert(result *RegressionResult) {}
func (rca *RootCauseAnalyzer) Analyze(ctx context.Context, data *PerformanceData, baseline *PerformanceBaseline) (*RootCauseInfo, error) { return nil, nil }
// String methods for enums
func (dm DetectionMethod) String() string {
switch dm {
case StatisticalDetection:
return "statistical"
case ThresholdDetection:
return "threshold"
case TrendDetection:
return "trend"
case ChangePointDetection:
return "changepoint"
case MLDetection:
return "ml"
case EnsembleDetection:
return "ensemble"
default:
return "unknown"
}
}
// Example usage
func ExampleRegressionAnalysis() {
// Create analyzer configuration
config := RegressionConfig{
DetectionMethods: []DetectionMethod{StatisticalDetection, TrendDetection, ChangePointDetection},
SensitivityLevel: MediumSensitivity,
BaselineWindow: 7 * 24 * time.Hour, // 7 days
ComparisonWindow: time.Hour,
MinDataPoints: 10,
ConfidenceLevel: 0.95,
SignificanceLevel: 0.05,
ChangeThreshold: 0.1, // 10% change threshold
EnableAutoBaseline: true,
EnableTrendAnalysis: true,
EnableAlerts: true,
EnableRootCause: true,
MaxHistory: 10000,
UpdateInterval: time.Minute,
}
// Create analyzer
analyzer := NewRegressionAnalyzer(config)
// Create sample performance data
data := &PerformanceData{
ID: "perf-001",
Timestamp: time.Now(),
Metrics: map[string]MetricValue{
"response_time": {
Value: 150.5,
Unit: "ms",
Type: LatencyMetric,
Samples: []float64{145.2, 148.7, 152.1, 149.8, 154.2},
Statistics: DescriptiveStats{
Count: 5,
Mean: 150.0,
Median: 149.8,
StdDev: 3.2,
Min: 145.2,
Max: 154.2,
},
},
"throughput": {
Value: 950.2,
Unit: "req/s",
Type: ThroughputMetric,
Samples: []float64{945.1, 952.3, 948.7, 951.8, 953.2},
Statistics: DescriptiveStats{
Count: 5,
Mean: 950.2,
Median: 951.8,
StdDev: 3.1,
Min: 945.1,
Max: 953.2,
},
},
},
Build: BuildInfo{
Version: "v1.2.3",
Commit: "abc123def456",
Branch: "main",
},
Environment: Environment{
OS: "linux",
Architecture: "amd64",
GoVersion: "1.21.0",
CPUCores: 8,
Memory: 16 * 1024 * 1024 * 1024, // 16GB
},
Quality: DataQuality{
Completeness: 1.0,
Stability: 0.95,
Reliability: 0.98,
OverallScore: 0.97,
},
}
// Analyze for regressions
ctx := context.Background()
result, err := analyzer.AnalyzeRegression(ctx, data)
if err != nil {
fmt.Printf("Regression analysis failed: %v\n", err)
return
}
fmt.Println("Regression Analysis Results:")
fmt.Printf("Regression detected: %t\n", result.Detected)
if result.Detected {
fmt.Printf("Severity: %v\n", result.Severity)
fmt.Printf("Confidence: %.2f\n", result.Confidence)
fmt.Printf("Affected metrics: %d\n", len(result.AffectedMetrics))
for _, metric := range result.AffectedMetrics {
fmt.Printf(" - %s: %.2f -> %.2f (%.2f%% change)\n",
metric.Name,
metric.BaselineValue,
metric.CurrentValue,
metric.Change.Relative*100)
}
if len(result.Recommendations) > 0 {
fmt.Println("Recommendations:")
for _, rec := range result.Recommendations {
fmt.Printf(" - %s: %s\n", rec.Type, rec.Description)
}
}
}
fmt.Printf("Analysis completed at: %v\n", result.Timestamp)
}
Statistical Methods
Advanced statistical techniques for robust regression detection.
Hypothesis Testing
Hypothesis tests such as Welch's t-test or the Mann-Whitney U test determine whether the difference between baseline and current samples is statistically significant or merely run-to-run noise.
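As a minimal sketch of this idea, the program below applies Welch's t-test to two latency sample sets; the welchT and meanVar helpers are illustrative, and the hard-coded critical value stands in for a proper lookup of the t-distribution at the computed degrees of freedom.

package main

import (
    "fmt"
    "math"
)

// welchT computes Welch's t statistic and the Welch-Satterthwaite degrees
// of freedom for two independent samples with possibly unequal variances.
func welchT(a, b []float64) (t, df float64) {
    ma, va := meanVar(a)
    mb, vb := meanVar(b)
    na, nb := float64(len(a)), float64(len(b))
    se := math.Sqrt(va/na + vb/nb)
    t = (mb - ma) / se
    num := math.Pow(va/na+vb/nb, 2)
    den := math.Pow(va/na, 2)/(na-1) + math.Pow(vb/nb, 2)/(nb-1)
    return t, num / den
}

// meanVar returns the sample mean and unbiased sample variance.
func meanVar(x []float64) (mean, variance float64) {
    for _, v := range x {
        mean += v
    }
    mean /= float64(len(x))
    for _, v := range x {
        variance += (v - mean) * (v - mean)
    }
    variance /= float64(len(x)) - 1
    return mean, variance
}

func main() {
    baseline := []float64{145.2, 148.7, 152.1, 149.8, 154.2}
    current := []float64{161.4, 158.9, 163.2, 160.7, 165.1}
    t, df := welchT(baseline, current)
    // At alpha = 0.05 the two-sided critical value for ~7 degrees of
    // freedom is about 2.36; a |t| above it indicates a significant shift.
    fmt.Printf("t = %.2f, df = %.1f, significant = %t\n", t, df, math.Abs(t) > 2.36)
}

In practice the critical value (or p-value) should come from a statistics library rather than a constant.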
Change Point Detection
Change point algorithms such as CUSUM, PELT, and binary segmentation locate the point in a metric's history where its level, trend, or variance shifted, which helps tie a regression to a specific build or deployment.
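The following sketch shows a one-sided CUSUM detector for upward shifts in a latency-style metric, assuming the baseline mean and standard deviation are already known; the drift value k and threshold h are common textbook defaults, not tuned recommendations.

package main

import (
    "fmt"
    "math"
)

// cusumUpper returns the index of the first sample at which the upper
// CUSUM statistic exceeds threshold h, signalling an upward level shift.
// k is the allowed drift per observation; both k and h are expressed in
// units of the baseline standard deviation.
func cusumUpper(samples []float64, mean, stddev, k, h float64) (int, bool) {
    s := 0.0
    for i, x := range samples {
        z := (x - mean) / stddev
        s = math.Max(0, s+z-k)
        if s > h {
            return i, true
        }
    }
    return -1, false
}

func main() {
    // Baseline near 150ms; the level shifts to roughly 160ms midway through.
    samples := []float64{149, 151, 150, 148, 152, 159, 161, 160, 162, 158}
    if idx, found := cusumUpper(samples, 150, 3, 0.5, 4); found {
        fmt.Printf("upward change detected at sample %d\n", idx)
    } else {
        fmt.Println("no change detected")
    }
}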
Time Series Analysis
Time series methods (trend fitting, exponential smoothing, ARIMA) characterize long-term drift and seasonality and forecast expected values, so slow regressions are not lost in run-to-run noise.
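As a sketch of the trend side, the snippet below fits an ordinary least-squares line to a metric series and extrapolates a few runs ahead; production forecasting would more likely use exponential smoothing or ARIMA, as listed in ForecastMethod above.

package main

import "fmt"

// linearTrend returns the least-squares slope and intercept of y over
// evenly spaced indices 0..n-1 (one index per benchmark run).
func linearTrend(y []float64) (slope, intercept float64) {
    n := float64(len(y))
    var sumX, sumY, sumXY, sumXX float64
    for i, v := range y {
        x := float64(i)
        sumX += x
        sumY += v
        sumXY += x * v
        sumXX += x * x
    }
    slope = (n*sumXY - sumX*sumY) / (n*sumXX - sumX*sumX)
    intercept = (sumY - slope*sumX) / n
    return slope, intercept
}

func main() {
    latency := []float64{148, 150, 151, 153, 155, 158, 160}
    slope, intercept := linearTrend(latency)
    fmt.Printf("trend: %+.2f ms per run\n", slope)
    // Naive linear forecast for the next three runs.
    for i := len(latency); i < len(latency)+3; i++ {
        fmt.Printf("forecast[%d] = %.1f ms\n", i, intercept+slope*float64(i))
    }
}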
Automated Monitoring
Continuous monitoring systems for real-time regression detection.
Real-time Detection
Streaming analysis evaluates each measurement as it arrives, so regressions are flagged within a single reporting interval instead of waiting for the next batch analysis.
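A minimal streaming sketch, assuming an exponentially weighted moving average (EWMA) with a fixed relative threshold; the 10% threshold mirrors the ChangeThreshold in the example configuration above and is an illustration rather than a recommendation.

package main

import "fmt"

// StreamingDetector maintains an EWMA of a metric and flags samples that
// exceed it by more than a relative threshold.
type StreamingDetector struct {
    alpha, threshold, ewma float64
    initialized            bool
}

// Observe folds a new sample into the EWMA and reports whether it looks
// like a regression relative to the smoothed history.
func (d *StreamingDetector) Observe(value float64) bool {
    if !d.initialized {
        d.ewma = value
        d.initialized = true
        return false
    }
    deviation := (value - d.ewma) / d.ewma
    d.ewma = d.alpha*value + (1-d.alpha)*d.ewma
    return deviation > d.threshold
}

func main() {
    d := &StreamingDetector{alpha: 0.2, threshold: 0.10} // flag >10% jumps
    for _, v := range []float64{150, 151, 149, 152, 150, 172, 175} {
        if d.Observe(v) {
            fmt.Printf("possible regression at %.0f ms\n", v)
        }
    }
}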
Baseline Management
Automated baseline establishment and maintenance, typically by recomputing baseline statistics over a rolling window of recent runs that were not themselves flagged as regressions.
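One plausible implementation, sketched below, keeps a fixed rolling window of recent accepted runs and recomputes the baseline statistics on every update; the window size and the rule of excluding runs already flagged as regressions are assumptions, not requirements.

package main

import (
    "fmt"
    "math"
)

// RollingBaseline keeps the last N accepted measurements and exposes the
// statistics used for later comparisons.
type RollingBaseline struct {
    window  int
    samples []float64
}

// Update appends a measurement and drops the oldest once the window is full.
// Runs already flagged as regressions should not be fed back in, otherwise
// the baseline gradually absorbs the regression.
func (b *RollingBaseline) Update(value float64) {
    b.samples = append(b.samples, value)
    if len(b.samples) > b.window {
        b.samples = b.samples[1:]
    }
}

// Stats returns the mean and population standard deviation of the window.
func (b *RollingBaseline) Stats() (mean, stddev float64) {
    for _, v := range b.samples {
        mean += v
    }
    mean /= float64(len(b.samples))
    for _, v := range b.samples {
        stddev += (v - mean) * (v - mean)
    }
    return mean, math.Sqrt(stddev / float64(len(b.samples)))
}

func main() {
    b := &RollingBaseline{window: 5}
    for _, v := range []float64{150, 149, 152, 151, 150, 148} {
        b.Update(v)
    }
    mean, stddev := b.Stats()
    fmt.Printf("baseline: mean=%.1f ms, stddev=%.2f ms over %d runs\n", mean, stddev, len(b.samples))
}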
Alert Integration
Integration with alerting systems for timely notifications.
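A common pattern, sketched below, is to post a JSON payload to a chat or incident webhook once a regression is confirmed; the endpoint URL and payload fields here are placeholders for whatever alerting system is actually in use.

package main

import (
    "bytes"
    "encoding/json"
    "fmt"
    "net/http"
)

// RegressionAlert is a minimal payload most webhook receivers can consume.
type RegressionAlert struct {
    Metric    string  `json:"metric"`
    Baseline  float64 `json:"baseline"`
    Current   float64 `json:"current"`
    ChangePct float64 `json:"change_pct"`
    Severity  string  `json:"severity"`
    Commit    string  `json:"commit"`
}

// sendAlert marshals the alert and posts it to the given webhook URL.
func sendAlert(webhookURL string, alert RegressionAlert) error {
    body, err := json.Marshal(alert)
    if err != nil {
        return err
    }
    resp, err := http.Post(webhookURL, "application/json", bytes.NewReader(body))
    if err != nil {
        return err
    }
    defer resp.Body.Close()
    if resp.StatusCode >= 300 {
        return fmt.Errorf("webhook returned status %d", resp.StatusCode)
    }
    return nil
}

func main() {
    alert := RegressionAlert{
        Metric: "response_time", Baseline: 150.0, Current: 172.4,
        ChangePct: 14.9, Severity: "major", Commit: "abc123def456",
    }
    // Placeholder endpoint; substitute your alerting system's webhook URL.
    if err := sendAlert("https://alerts.example.com/webhook", alert); err != nil {
        fmt.Printf("alert delivery failed: %v\n", err)
    }
}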
Best Practices
- Baseline Quality: Maintain high-quality baselines with sufficient data
- Multiple Methods: Use multiple detection methods and combine their verdicts for robust analysis (see the ensemble sketch after this list)
- Statistical Rigor: Apply proper statistical methods with appropriate thresholds
- Context Awareness: Consider environmental and deployment context
- Root Cause Analysis: Implement comprehensive root cause identification
- Automated Response: Automate common remediation actions
- Continuous Learning: Continuously improve detection algorithms
- Documentation: Maintain detailed documentation of regression events
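To make the multiple-methods practice concrete, here is a small sketch that combines independent detector verdicts with a confidence-weighted vote; the weighting scheme is an assumption, and a production ensemble would typically also factor in each detector's historical accuracy.

package main

import "fmt"

// Verdict is one detector's opinion about a benchmark run.
type Verdict struct {
    Method     string
    Detected   bool
    Confidence float64 // 0..1
}

// ensembleDetect declares a regression when the confidence-weighted votes
// in favour outweigh those against, and reports the combined confidence.
func ensembleDetect(verdicts []Verdict) (bool, float64) {
    var forScore, againstScore float64
    for _, v := range verdicts {
        if v.Detected {
            forScore += v.Confidence
        } else {
            againstScore += v.Confidence
        }
    }
    total := forScore + againstScore
    if total == 0 {
        return false, 0
    }
    return forScore > againstScore, forScore / total
}

func main() {
    verdicts := []Verdict{
        {Method: "statistical", Detected: true, Confidence: 0.9},
        {Method: "threshold", Detected: false, Confidence: 0.6},
        {Method: "changepoint", Detected: true, Confidence: 0.7},
    }
    detected, confidence := ensembleDetect(verdicts)
    fmt.Printf("ensemble: detected=%t confidence=%.2f\n", detected, confidence)
}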
Summary
Performance regression analysis provides critical capabilities for maintaining application quality:
- Early Detection: Identify performance regressions as soon as they occur
- Statistical Rigor: Apply robust statistical methods for reliable detection
- Root Cause Analysis: Automatically identify likely causes of regressions
- Impact Assessment: Quantify the business and technical impact
- Automated Response: Implement automated remediation for common issues
- Continuous Improvement: Learn from historical data to improve detection
These techniques enable organizations to maintain consistent application performance and quickly address issues before they impact users.