Legacy Application Assessment Services
Know What You Have Before You Rewrite It. A forensic audit of your technical debt, risks, and modernization options.
- ROI Timeframe: 6-9 months
- Market Starting Price: $30K - $60K
- Vendors Analyzed: 6 Rated
- Category: Strategy & Planning
Updated: February 2026 · Based on 445 verified engagements · Author: Peter Korpak · Independent methodology →
Should You Engage Legacy Application Assessment Services?
Engage this service if...
- → You have 20+ applications and no central inventory of which are business-critical versus zombie apps
- → A cloud migration or major modernization program is planned in the next 12 months and you need to sequence the work
- → An M&A transaction requires technical due diligence on an acquired company's application portfolio
- → A compliance audit has requested documentation of all systems running EOL software
- → A new CTO or CIO has joined and needs an honest landscape assessment before committing to a roadmap
This service is not the right fit if...
- ✗ You have fewer than 5 applications — a manual inventory is faster and more accurate
- ✗ You already know exactly what you want to modernize and the scope is defined — start execution directly
- ✗ You have no budget to act on findings — commissioning an assessment without remediation budget wastes investment
- ✗ Your portfolio was assessed within the last 12 months and no major changes have occurred
Alternative Paths
| Alternative | Why Consider It | Best For |
|---|---|---|
| Cloud Readiness Assessment | More focused on migration suitability and TCO than overall portfolio health | Organizations specifically planning cloud migration rather than broad portfolio modernization |
| Modernization Strategy Services | Combines portfolio assessment with strategic roadmap and financial modeling | Organizations needing both portfolio analysis and board-level business case |
Business Case
According to Modernization Intel's analysis, organizations that invest in legacy application assessment services typically see returns within 6-9 months, with typical savings of 15-25% of annual portfolio costs through portfolio rationalization.
Signs You Need This Service
The 'Black Box' Problem
You have 50+ apps written in Java 6 or .NET 2.0. The original developers left 5 years ago. No one knows how they work, only that they break if you touch them.
M&A Due Diligence
You just acquired a company. Their CTO says the tech is 'modern'. You need a third-party audit to prove it's actually spaghetti code before you integrate it.
Cloud Migration Stalled
You tried to 'Lift & Shift' everything to AWS. It failed because the apps weren't cloud-ready. Now you need to know which ones to refactor and which to retire.
Compliance Audit Panic
Auditors are asking for a list of all systems running EOL (End of Life) software. You don't have that list. You need an automated inventory immediately.
Sound familiar? If 2 or more of these apply to you, this service can deliver immediate value.
Buyer's Deep Dive
The Challenge
Legacy application assessment solves a visibility problem before it becomes an execution problem: organizations that begin modernization programs without a rigorous portfolio inventory migrate the wrong applications in the wrong order, or miss critical security vulnerabilities in systems they didn’t know existed. Based on analysis of 445 engagements, organizations that skip formal assessment before modernization have a 40% higher wave failure rate and spend an average of 35% more on remediation than those with pre-migration assessments.
The CMDB accuracy problem compounds the visibility gap. Central configuration management databases become stale within 12–18 months as teams provision resources outside formal change management processes. Analysis of 445 engagements found that CMDB-only portfolio inventories miss an average of 32% of actual running applications — including “zombie applications” (running but not used) and “shadow IT” (deployed by business units without IT knowledge). These undiscovered applications are disproportionately likely to run EOL software, because they were forgotten before patching routines captured them.
The high success rate (83%) reflects that portfolio assessment is a well-understood, lower-risk engagement compared to migration execution. When it fails, the cause is almost always post-assessment: organizations commission assessments without pre-committed budget for remediation, resulting in a risk register that stays on a shelf.
How to Evaluate Providers
Legacy application assessment providers differentiate on discovery methodology, scoring rigor, and deliverable quality. The key question is: how do they find applications you don’t know about?
Discovery methodology comparison:
| Method | Application Coverage | Time | Cost Premium | Best For |
|---|---|---|---|---|
| CMDB export only | 65–70% | 1 week | None | Known portfolios, internal awareness only |
| Network scanning | 85–90% | 2–3 weeks | 20–30% | Finding shadow IT and undocumented systems |
| Agent-based discovery | 92–96% | 2–4 weeks | 30–50% | Most complete — finds all running processes |
| Hybrid (scanning + CMDB + interviews) | 95–98% | 3–5 weeks | 25–40% | Best balance of coverage and stakeholder context |
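The coverage gap between methods comes down to reconciliation: each discovery source sees a different slice of the portfolio. A minimal sketch of the hybrid approach's core step, reconciling a CMDB export against network-scan results (hostnames and data shapes here are illustrative, not any specific tool's format):

```python
def reconcile(cmdb_hosts: set[str], scanned_hosts: set[str]) -> dict[str, set[str]]:
    """Compare what the CMDB says is running against what a scan actually found."""
    return {
        "shadow_it": scanned_hosts - cmdb_hosts,    # running, but undocumented
        "stale_cmdb": cmdb_hosts - scanned_hosts,   # documented, but not seen running
        "confirmed": cmdb_hosts & scanned_hosts,    # both sources agree
    }

# Illustrative data: a stale CMDB versus a scan that surfaces shadow IT
cmdb = {"erp-prod", "crm-01", "hr-portal"}
scan = {"erp-prod", "crm-01", "legacy-ftp", "report-svc"}

result = reconcile(cmdb, scan)
print(sorted(result["shadow_it"]))   # → ['legacy-ftp', 'report-svc']
print(sorted(result["stale_cmdb"]))  # → ['hr-portal']
```

The "shadow_it" bucket feeds the stakeholder interviews; the "stale_cmdb" bucket is what makes CMDB-only inventories miss roughly a third of running applications.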
Red flags:
- Assessments that rely exclusively on CMDB exports (misses 30% of applications on average)
- No business criticality scoring methodology beyond the IT perspective (business owners, not just IT, know which systems actually drive revenue)
- T-shirt sizing effort estimates without methodology documentation (effort estimates with no rationale cannot be challenged or calibrated)
- Deliverables in read-only PDF format (assessment outputs must be editable so teams can maintain them post-engagement)
What to look for: Providers with specific automated scanning tool expertise (CAST Highlight for large portfolios, Veracode for security-focused assessments), case studies from similar portfolio sizes and industries, and deliverables that include unlocked Excel and interactive dashboards.
Implementation Patterns
Successful legacy assessments combine automated scanning with business owner interviews. Automated tools provide coverage and objectivity; interviews provide business context that tools cannot infer from code.
Phased discovery pattern:
- Automated inventory (week 1–2): Network scanning identifies all running applications. Static code analysis tools (CAST Highlight, SonarQube) produce objective code quality metrics (cyclomatic complexity, technical debt ratio, security vulnerability counts). This phase is non-disruptive — no code changes, no downtime.
- Business criticality interviews (week 2–3): Each application is scored with its business owner on business impact (revenue dependency, regulatory criticality, user count), business value (strategic vs commodity), and change frequency (how often the business needs updates). The IT perspective alone systematically underestimates business criticality for applications used by specific business units.
- 6 R’s disposition (week 3–4): Apply the 6 R’s framework to each application using combined technical scores (from automated analysis) and business scores (from interviews). The disposition framework: Rehost (move to cloud as-is), Replatform (minor cloud optimizations), Refactor (significant code changes), Repurchase (replace with SaaS), Retire (decommission), Retain (keep on-prem, don’t migrate).
- Prioritized roadmap (week 4–6): Sequence migration waves by combining: quick wins (high value + low complexity), risk reduction (critical security vulnerabilities), and strategic alignment (applications blocking digital transformation goals).
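The disposition step in week 3–4 is essentially a decision table over the technical and business score axes. A sketch of how combined scores might map to the 6 R's; the 1–10 score scale and the thresholds are illustrative assumptions that real engagements calibrate per portfolio:

```python
def disposition(tech_health: int, business_value: int, cloud_ready: bool) -> str:
    """Map per-app scores (1-10, higher is better) to a 6 R's disposition.
    Thresholds are illustrative, not a standard."""
    if business_value <= 2:
        return "Retire"        # nobody needs it: decommission
    if tech_health <= 3:
        # poor code quality: replace commodity apps, rework strategic ones
        return "Repurchase" if business_value <= 5 else "Refactor"
    if cloud_ready:
        return "Rehost"        # healthy and portable: move as-is
    if business_value >= 6:
        return "Replatform"    # strategic, needs minor cloud optimizations
    return "Retain"            # healthy, low change need: leave on-prem

assert disposition(tech_health=8, business_value=9, cloud_ready=True) == "Rehost"
assert disposition(tech_health=2, business_value=9, cloud_ready=False) == "Refactor"
assert disposition(tech_health=7, business_value=1, cloud_ready=True) == "Retire"
```

The value of encoding the rules this way is auditability: every disposition in the deliverable can be traced back to two scores and a documented threshold, which is exactly what the "T-shirt sizing without methodology" red flag above warns against.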
Zombie application identification: Applications that generate network traffic but have no active business users are candidates for immediate retirement. The typical enterprise portfolio has 20–30% zombie applications — running, licensed, and patched for systems nobody needs. Retiring these before migration reduces total migration scope by 20–30% and produces immediate licensing and hosting savings.
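The traffic-versus-users signal can be checked mechanically once discovery data is in hand. A sketch, assuming you have per-app network byte counts and authentication-log user counts for the same window (both data sources and the thresholds are illustrative assumptions):

```python
def find_zombies(traffic_bytes: dict[str, int],
                 active_users: dict[str, int],
                 min_traffic: int = 1) -> list[str]:
    """Flag apps that still generate machine traffic (health checks,
    backups, replication) but show no human logins in the window."""
    return sorted(
        app for app, traffic in traffic_bytes.items()
        if traffic >= min_traffic and active_users.get(app, 0) == 0
    )

traffic = {"erp-prod": 9_000_000, "old-report-svc": 40_000, "intranet-v1": 12_000}
logins  = {"erp-prod": 1_400, "intranet-v1": 3}
print(find_zombies(traffic, logins))  # → ['old-report-svc']
```

In practice the login data comes from SSO or directory audit logs; apps that never appear there at all (like `old-report-svc` above) are the retirement candidates worth confirming with a business owner before decommissioning.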
Technical debt quantification: Assessment deliverables should include a $ estimate of technical debt, not just a “high/medium/low” rating. The SQALE method (CAST, SonarQube) estimates technical debt as the engineering hours required to bring code to acceptable quality standards. This enables direct ROI comparison: “fixing this application’s technical debt costs $400K vs replacing it with a SaaS alternative for $80K/year.”
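The SQALE arithmetic itself is simple. A sketch of the dollar estimate: the development-cost default mirrors SonarQube's documented 30 minutes per line of code, the rating grid follows its default maintainability thresholds, and the hourly rate is an illustrative blended engineering rate, not a standard:

```python
def sqale_debt(remediation_hours: float, lines_of_code: int,
               dev_minutes_per_loc: float = 30.0,
               hourly_rate_usd: float = 120.0) -> dict:
    """SQALE-style technical debt estimate.
    debt ratio = remediation effort / estimated effort to rebuild from scratch."""
    rebuild_hours = lines_of_code * dev_minutes_per_loc / 60
    ratio = remediation_hours / rebuild_hours
    # SonarQube's default maintainability rating grid
    rating = ("A" if ratio <= 0.05 else "B" if ratio <= 0.10 else
              "C" if ratio <= 0.20 else "D" if ratio <= 0.50 else "E")
    return {"debt_usd": round(remediation_hours * hourly_rate_usd),
            "debt_ratio": round(ratio, 3),
            "rating": rating}

# Roughly the $400K example from the text: 3,300 remediation hours on a 100K-LOC app
print(sqale_debt(remediation_hours=3_300, lines_of_code=100_000))
# → {'debt_usd': 396000, 'debt_ratio': 0.066, 'rating': 'B'}
```

Expressing debt as dollars and a ratio, rather than "high/medium/low", is what makes the fix-versus-replace comparison in the paragraph above possible.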
Total Cost of Ownership
Legacy assessment is one of the highest-ROI modernization engagements because it prevents misallocation of much larger migration budgets. Based on 445 engagements, organizations that act on assessment findings avoid an average of $1.2M in migration waste (migrating wrong applications, encountering undiscovered dependencies, redundant remediation).
Cost model by portfolio size:
| Portfolio Size | Engagement Cost | Typical Findings Value | ROI Multiple |
|---|---|---|---|
| 20–50 apps | $30K–$60K | $200K–$600K avoided waste | 4–10× |
| 50–100 apps | $75K–$150K | $500K–$1.5M avoided waste | 5–10× |
| 100–300 apps | $200K–$400K | $1M–$4M avoided waste | 4–10× |
Hidden costs beyond the engagement fee:
| Cost Category | Typical Range | Notes |
|---|---|---|
| Application owner interview time | $20K–$50K | 1–2 hrs × 20–50 stakeholders at $100–$150/hr fully loaded |
| CMDB enrichment post-assessment | $15K–$40K | IT team effort to update central inventory with findings |
| Findings presentation preparation | $5K–$20K | Internal stakeholder briefings beyond the executive presentation |
Zombie app retirement savings: The median assessment finds 23% of portfolio applications are retirement candidates. For a 100-application portfolio, retiring 23 applications typically saves $300K–$800K in annual licensing, hosting, and maintenance costs — often recovering the full assessment cost within the first year.
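The retirement arithmetic is simple enough to sanity-check yourself. A sketch using the 23% median rate from the engagement data above; the per-app annual run-cost range is an assumption backed out of the $300K–$800K figure, not a benchmark:

```python
def retirement_savings(portfolio_size: int,
                       zombie_rate: float = 0.23,
                       annual_cost_per_app: tuple[int, int] = (13_000, 35_000)):
    """Estimate annual savings from retiring zombie applications.
    zombie_rate is the median assessment finding; per-app licensing,
    hosting, and maintenance costs are illustrative assumptions."""
    candidates = round(portfolio_size * zombie_rate)
    low, high = (candidates * cost for cost in annual_cost_per_app)
    return candidates, low, high

print(retirement_savings(100))  # → (23, 299000, 805000)
```

For a 100-app portfolio this reproduces the $300K–$800K range quoted above, which is why the assessment fee is usually recovered within the first year of retirements alone.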
Post-Engagement: What Happens Next
After a legacy assessment, you own an application portfolio inventory, technical debt heatmap, 6 R’s disposition plan, and a sequenced modernization roadmap. The next step is committing budget to act on findings.
Typical post-engagement sequence:
- Week 1–4 post-assessment: Review findings with executive team. Approve budget for modernization program. Assign internal application owners to each application in the disposition plan.
- Month 1–3: Retire zombie applications (immediate cost savings, zero migration risk). Begin Wave 1 (highest-priority, lowest-complexity applications).
- Month 3–12: Execute migration waves following the assessment roadmap. Reassess changed applications if major scope changes occur.
- Month 12–24: Mid-program review. Validate that TCO projections from assessment are tracking actual results. Adjust wave sequencing if business priorities have shifted.
Keeping the assessment current: Portfolio assessments become stale within 18–24 months as applications are deployed, retired, and changed. Plan a lightweight refresh assessment every 18–24 months, or after major events (M&A, major cloud migration, significant new application deployments).
Connecting assessment to execution: The assessment roadmap should directly inform your migration program’s Wave 1 scope. Organizations that let assessment deliverables sit without acting within 6 months typically re-commission assessments before acting — paying twice for the same analysis.
What to Expect: Engagement Phases
A typical legacy application assessment services engagement follows 3 phases. Timelines vary based on scope and organizational complexity.
Typical Engagement Timeline
Standard delivery phases for this service type. Use this to validate vendor project plans.
Phase 1: Automated Discovery
Duration: 1-2 weeks
Activities
- → Static Code Analysis (CAST, SonarQube)
- → Infrastructure Scanning
- → Dependency Mapping
Outcomes
- ✓ Raw Inventory Data
- ✓ Vulnerability Report
Typical Team Composition
Solution Architect
The 'Detective'. Digs into the code and architecture to find the truth.
Business Analyst
The 'Translator'. Maps technical complexity to business value.
Security Specialist
Checks for vulnerabilities and compliance risks.
Standard Deliverables & Market Pricing
The following deliverables are standard across qualified providers. Pricing reflects current market rates based on Modernization Intel's vendor analysis.
Standard SOW Deliverables
Don't sign a contract without these. Ensure your vendor includes these specific outputs in the Statement of Work:
All deliverables are yours to keep. No vendor lock-in, no proprietary formats. Use these assets to execute internally or with any partner.
Engagement Models: Choose Your Path
Based on data from 200+ recent SOWs. Use these ranges for your budget planning.
- Portfolio Assessment (20-50 apps): automated scanning + high-level roadmap. 4-6 weeks.
What Drives Cost:
- Number of systems/applications in scope
- Organizational complexity (business units, geo locations)
- Timeline urgency (standard vs accelerated delivery)
- Stakeholder involvement (executive workshops, training sessions)
Flexible Payment Terms
We offer milestone-based payments tied to deliverable acceptance. Typical structure: 30% upon kickoff, 40% at mid-point, 30% upon final delivery.
Hidden Costs Watch
- Travel: Often billed as "actuals" + 15% admin fee. Cap this at 10% of fees.
- Change Orders: "Extra meetings" can add 20% to the bill. Define interview counts rigidly.
- Tool Licensing: Watch out for "proprietary assessment tool" fees added on top.
Independently Rated Providers
The following 6 vendors have been independently assessed by Modernization Intel for legacy application assessment services capability, scored on methodology transparency, delivery track record, pricing clarity, and specialization fit.
Why These Vendors?
Vetted Specialists

| Company | Specialty | Best For |
|---|---|---|
| CAST Software | Automated Code Analysis (CAST Highlight) | Large portfolios (100+ apps) needing rapid assessment |
| Micro Focus | Legacy Language Experts (COBOL, Fortran, PL/I) | Mainframe and midrange application analysis |
| vFunction | Java Monolith Decomposition Analysis | Assessing Java apps for microservices migration |
| Software Improvement Group (SIG) | Software Health Assessment (ISO 25010) | M&A due diligence and quality benchmarking |
| Modernizing Medicine | Healthcare Legacy Systems | HIPAA-compliant healthcare application assessment |
| Thoughtworks | Strategic Modernization Assessment | Combining technical + business value analysis |
Vendor Evaluation Questions
- What automated scanning tools do you use — CAST Highlight, SonarQube, Veracode, or proprietary tools?
- How do you discover applications not in the CMDB — what network scanning methodology do you use?
- How do you score business criticality — what data sources beyond IT do you consult?
- What is your 6 R's (Rehost/Replatform/Refactor/Repurchase/Retire/Retain) methodology?
- How do you quantify the technical debt finding — what is your $ methodology for debt estimation?
- What deliverables do we own after the engagement — are reports unlocked and editable?
- How do you calibrate effort estimates — what is your T-shirt sizing methodology?
Reference Implementation
Global manufacturer had 400+ applications across 20 factories. No central inventory. Ransomware attack took down a critical plant because of an unpatched Windows 2003 server no one knew about.
Conducted automated portfolio assessment. Identified 120 'Zombie Apps' (running but not used) and 50 critical security risks.
- → Retired 120 apps (saving $2M/year in licensing/hosting)
- → Patched/ring-fenced all critical vulnerabilities in 30 days
- → Created 3-year modernization roadmap for core ERP
Frequently Asked Questions
Q1 Do legacy application assessment services use automated tools?
Yes, we use static analysis tools like CAST, SonarQube, and Micro Focus Enterprise Analyzer to scan code, but Senior Architects interpret the results to find architectural flaws and business logic that tools miss. Tools find syntax errors; architects find strategic problems.
Q2 What if we don't have documentation for our legacy applications?
That's normal and expected. We specialize in 'Software Archaeology' - reading code to reverse-engineer business logic without docs. 80% of our clients have zero documentation. We interview remaining SMEs and use code analysis to reconstruct how systems actually work.
Q3 How much do legacy application assessment services cost?
$30K-$400K depending on portfolio size. Portfolio assessment (20-50 apps, 4-6 weeks) = $30K-$60K. Mid-size portfolio (50-100 apps) = $75K-$150K. Full IT landscape (100-300 apps, 8-12 weeks) = $200K-$400K. ROI: typical clients retire 15-25% of the portfolio, saving millions in annual maintenance costs.
Q4 How long does a legacy application assessment take?
2-12 weeks depending on complexity. Single application = 2-3 weeks. Portfolio (20-50 apps) = 4-6 weeks. Enterprise landscape (100+ apps) = 8-12 weeks. We deliver assessments quickly because you need data to make budget decisions - not 6-month analysis paralysis projects.
Q5 What happens after the assessment?
You get a prioritized roadmap with a 6 R's disposition (Rehost/Replatform/Refactor/Repurchase/Retire/Retain) for each app. You own all the data (Excel, code scan reports, architecture diagrams). Then you bid out execution work to multiple vendors using our independent assessment - this typically saves millions versus letting the vendor that performed the assessment also do the rewrite.
Q6 Can you assess mainframe applications and COBOL code?
Yes. Legacy assessment covers all platforms: Mainframe (COBOL, PL/I, Assembler), Midrange (AS/400, RPG), Client-Server (PowerBuilder, VB6, Delphi), and Early Web (ColdFusion, Classic ASP). We have specialized tools and architects for each platform. If it's old and undocumented, we can assess it.