
Code Level Runtime Analytics


Solved by DevTech

Question

wrack

Hi,

 

I've been given a task to analyse and collect info on all the code paths in a very large, monolithic .NET web app that is used across various clients.

 

I'm thinking about adding some reflection-type code that collects data on the run-time code paths taken, depending on the params passed to various methods.

 

Once I have the data, I need to start removing/deprecating code that doesn't get used in the real world for a few months, along with refactoring and adding more test coverage on the major code paths to improve quality.
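Roughly the kind of thing I have in mind, as a minimal sketch (all type and method names here are invented for illustration): a thread-safe counter of method entries using [CallerMemberName], so each instrumented method only needs a one-line call at the top.

```csharp
using System;
using System.Collections.Concurrent;
using System.Runtime.CompilerServices;

// Hypothetical sketch: a thread-safe counter of method entries.
public static class CallRecorder
{
    private static readonly ConcurrentDictionary<string, long> Counts =
        new ConcurrentDictionary<string, long>();

    // Call at the top of any method to be tracked; the compiler
    // fills in the caller's name automatically.
    public static void Record([CallerMemberName] string method = "")
    {
        Counts.AddOrUpdate(method, 1, (_, n) => n + 1);
    }

    public static long CountFor(string method) =>
        Counts.TryGetValue(method, out var n) ? n : 0;
}

// Instrumenting an existing class (invented example):
public class InvoiceService
{
    public void GenerateInvoice()
    {
        CallRecorder.Record();   // records one hit for "GenerateInvoice"
        // ... existing business logic ...
    }
}
```

In a real deployment the counts would be flushed periodically to a log or database rather than held in memory.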

 

I am aware of the unit testing approach, but this app is 13+ years old: unit tests can't just be added, and the code can't easily be refactored without breaking something, nor easily rewritten.

 

Are there any tools or frameworks that can be integrated into the existing code that would allow me to collect such data?

 

I'm not looking for code samples, but if anyone has some, awesome. Any help, general guidance, or software/framework recommendation would be highly appreciated.

 

Something like Google Analytics but for the actual source code itself.

 

TA :)


8 answers to this question


DevTech
9 hours ago, wrack said:

I've been given a task to analyse and collect info on all code paths for a very large monolithic .NET web app that is used across various clients. [...]

Plain old code profiling/analysis has been a popular tool category since the dawn of .NET, but to some extent Roslyn has sparked a modern revolution in many .NET tools.

 

1. Unit testing has NOTHING to do with any aspect of this.

 

2. nanoRant: Constant, continuous code refactoring was the real useful business "take-away" from Extreme Programming, not unit testing, which is mostly a sick joke in the currently common watered-down, weak descendants of Extreme Programming techniques.

 

3. You can use AOP to instrument any large body of existing code: https://www.postsharp.net/aop.net
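With PostSharp you would typically write a method-boundary aspect; if taking a dependency isn't an option, the same interception idea can be sketched with the BCL's DispatchProxy (interface calls only; built into .NET Core and available for .NET Framework 4.6.1+ via the System.Reflection.DispatchProxy package). All type names below are invented for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

// Invented example interface and implementation:
public interface IOrderService
{
    int PlaceOrder(string clientId);
}

public class OrderService : IOrderService
{
    public int PlaceOrder(string clientId) => 42;
}

// A run-time tracing wrapper: every call through the proxy is
// logged before being forwarded to the real target.
public class TracingProxy<T> : DispatchProxy where T : class
{
    private T _target;
    public static List<string> Log { get; } = new List<string>();

    public static T Wrap(T target)
    {
        var proxy = (TracingProxy<T>)(object)Create<T, TracingProxy<T>>();
        proxy._target = target;
        return (T)(object)proxy;
    }

    protected override object Invoke(MethodInfo method, object[] args)
    {
        Log.Add($"{method.Name}({string.Join(", ", args)})");
        return method.Invoke(_target, args);
    }
}
```

Wrapping a service with `TracingProxy<IOrderService>.Wrap(new OrderService())` then records every interface call, e.g. "PlaceOrder(acme)", without touching the existing class. PostSharp's build-time weaving avoids the interface restriction and the reflection overhead, which is why it suits a large legacy codebase better.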

 

4. .NET has the most advanced compiler on Planet Earth in the form of Roslyn, so any tool that uses Roslyn's code-understanding features should be given preference.

 

https://github.com/dotnet/roslyn

 

5. Here are a few starting points for you:

 

https://github.com/mre/awesome-static-analysis

 

https://en.wikipedia.org/wiki/List_of_tools_for_static_code_analysis

 

https://www.owasp.org/index.php/Source_Code_Analysis_Tools

 

https://visualstudiomagazine.com/articles/2018/05/01/vs-analysis-tools.aspx

 

https://www.sonarsource.com/products/codeanalyzers/sonarcsharp.html

https://github.com/SonarSource/sonar-dotnet

 

https://www.ndepend.com

 

 

wrack

Thank you. You are spot on about many things, and the mess I have is inherited, so there is very little I can do about unit tests, continuous code refactoring, etc. The initiatives are also inherited, so I have no choice but to investigate the possibilities.

 

Let me tell you the most important outcome I am after. There are a lot of QA issues, and they only appear in UAT or, even worse, post-deployment :( What I am trying to achieve is to find the most-hit areas of the code for different client configurations and then get our QA teams to hit those areas as hard as they can to sort out the QA issues. In the long run the whole thing is going to be rewritten, but that is a few years away, and I need results in 3-4 months.

 

So, that said, I am actually looking for run-time recording and then analysis of method calls and parameter values, so I can generate a heat map for each client configuration and then work with the QA team. Hopefully this gives a little more context.
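Conceptually, the aggregation I'm after is just hit counts keyed by (client configuration, method), plus a query for the hottest methods per configuration. A hypothetical sketch (invented names):

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;

// Hypothetical sketch of the per-configuration "heat map":
// hit counts keyed by (client configuration, method name).
public static class HeatMap
{
    private static readonly ConcurrentDictionary<(string Config, string Method), long> Hits =
        new ConcurrentDictionary<(string Config, string Method), long>();

    public static void Record(string config, string method) =>
        Hits.AddOrUpdate((config, method), 1, (_, n) => n + 1);

    // The hottest methods for one client configuration, for QA to target.
    public static (string Method, long Count)[] Hottest(string config, int top) =>
        Hits.Where(kv => kv.Key.Config == config)
            .OrderByDescending(kv => kv.Value)
            .Take(top)
            .Select(kv => (kv.Key.Method, kv.Value))
            .ToArray();
}
```

Each client configuration then gets its own ranked list of heavy hitters for the QA team to attack.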

DevTech
57 minutes ago, wrack said:

Thank you. You are spot on about many things. [...] I am actually looking for run-time recording and then analysis of method calls and parameter values, so I can generate a heat map for each client configuration. [...]

Oh, that's simple then, just plain old Visual Studio 2019 stuff...

 

Dynamic tracing of executing code (IntelliTrace) is a standard, well-tested and very useful feature of Visual Studio 2019:

 

https://azuredevopslabs.com/labs/devopsserver/intellitrace/

 

Also useful to your area:

 

https://azuredevopslabs.com/labs/devopsserver/codeanalysis/

 

https://azuredevopslabs.com/labs/devopsserver/intellitest/

 

https://azuredevopslabs.com/labs/devopsserver/liveunittesting/

 

https://azuredevopslabs.com/labs/devopsserver/livedependencyvalidation/

 

https://azuredevopslabs.com/labs/devopsserver/releasemanagement/

 

 

 

 

DevTech

As a general observation, you seem to be describing a very large-scale retrofit of running code with instrumentation payloads, a re-architecture best introduced in a green-field scenario.

 

Your system then becomes a high risk for Heisenbugs, which can lead to a nightmare of "ring around the rosies":

 

https://en.wikipedia.org/wiki/Heisenbug

 

wrack
16 hours ago, DevTech said:

Oh, that's simple then, just plain old Visual Studio 2019 stuff...

 

We are still on VS2013. We have secured an enterprise agreement to get VS2019, and a few selected people, including myself, have it, but wide-scale deployment is a few months away.

 

Speaking to a senior architect: we have used New Relic before with amazing success, but it being a financial system, and with our company's stance on not using cloud tech (just yet), New Relic was ruled out. We are in the process of getting that ban reviewed.

16 hours ago, DevTech said:

As a general observation, you seem to be describing a very large-scale retrofit of running code with instrumentation payloads, a re-architecture best introduced in a green-field scenario.

 

Another reason I am collecting the data is to identify the heavy hitters and address QA issues with extensive test coverage (not unit tests, but end-to-end scenario-based tests) using our automated regression system.

 

Thank you for your help and guidance on this. Much appreciate it.

DevTech
1 hour ago, wrack said:

We are still on VS2013. [...] Thank you for your help and guidance on this. Much appreciate it.

Well, obviously my thoughts have been very generic, but they may have the small useful attribute of being an "outside viewpoint".

 

Each of your replies drops an extra hint about a legacy system of very large size and complexity, for which you deserve extra credit for seeking all viewpoints and ideas in your evaluation process.

 

So, in that spirit, and to be complete in the spirit of the excellent due diligence you are exhibiting, I will point out a HUGE sea change in the design, architecture, deployment and real-time delivery of modern enterprise (and anything large) applications to users: the Kubernetes revolution. At this point EVERY enterprise player has signed on to this architecture; it has arrived and will be considered mandatory dial-tone infrastructure within a few years, if not right now.

 

I point this out in your case because any establishment using wonderful .NET technology may have missed some of the signals and messaging around this architecture, since at first glance it seems to be about things a bit distant from .NET platforms: "Cloud" and Linux. Even if it is not possible to shoehorn a legacy system into the new way of doing things, there may be opportunities to build in compatibilities as you go along...

 

The standards around this architecture are run by the CNCF (Cloud Native Computing Foundation, part of the Linux Foundation), and it can easily be missed that it describes the future of enterprise computing BOTH for cloud and on-premise, and ALSO both for Linux and for Windows. Microsoft is a PRIMARY member of this foundation. There is no restriction on following a CNCF standard on local servers and with Windows technology. In fact, some of the tech is already baked right into the Windows API.

 

Skipping all the crap in between, the beautiful result of twisting application architecture into many Docker containers managed by Kubernetes is that the application becomes robust, scalable, hot-deployable and, most importantly for enterprise, self-healing with zero downtime. Kubernetes manages the life cycle and moves containers around as needed by resource requirements, best fit and demand loading. All the infrastructure is free OSS, can run on local servers and dev machines (well, beefy ones...) and, once working, scales with zero or little effort to larger local clusters or the cloud, since it is a standard supported by every cloud provider.

 

The downside is a bit of head-scratching to understand where to store state when the application containers are stateless (the only way to get self-healing), and how to talk to your application when Kubernetes might have moved it anywhere!

 

Windows 10 and the latest Windows Server have native code built into the Windows API to support both native Windows containers and Linux containers. The latest version of .NET Core thrives in this flexible, cross-platform, ubiquitous environment.

 

https://www.cncf.io

https://www.cncf.io/about/members/

https://landscape.cncf.io

https://www.docker.com/products/windows-containers

 

Windows Containers on Windows 10

https://docs.microsoft.com/en-us/virtualization/windowscontainers/quick-start/quick-start-windows-10

 

Linux Containers on Windows 10

https://docs.microsoft.com/en-us/virtualization/windowscontainers/quick-start/quick-start-windows-10-linux

 

 

(Attached: CNCF Trail Map diagrams)

wrack

Thank you again.

 

For the eventual rewrite, microservices, Docker, event sourcing, CQRS, snapshot DBs etc. are all avenues being considered, but that is for the chief architect to take care of.

 

For the moment my focus is elsewhere.

DevTech

Hey I'm sure that will work out well for you. Best wishes there...

 

If you want as you go along, feel free to throw out thoughts, inquiries, etc  - micro or macro...

 

A Note for other readers:

 

Something like a snapshot of a system or a database is very primitive compared to container self-healing, which is a quantum-leap first step towards a Holy Grail of computing. It works. It has not beaten down the doors of anyone's attention, since it can be seen as "limited" in that it needs major changes to the architecture of things. It is a remarkable by-product of Docker containers being stateless, where an entire application image becomes the (huge) equivalent of a stateless HTTP request.

 

Normal stuff you expect in your VM server world that is missing in zillions of amorphous clusters of Docker containers:

 

1. You need a service mesh to locate and talk to your app:

(your app moves around, changes IP address, adds copies of itself under demand load, etc.)

 

Examples of CNCF Solutions:

https://linkerd.io  https://github.com/linkerd/linkerd2

https://www.getambassador.io  https://github.com/datawire/ambassador

https://www.envoyproxy.io   https://github.com/envoyproxy/envoy

https://traefik.io  https://github.com/containous/traefik

 

2. You will most likely need a file system composed of specialized containers:

(your app can be destroyed, moved, etc., so, like an HTTP request, nothing is retained locally)

 

Examples of CNCF Solutions:

https://rook.io   https://github.com/rook/rook

https://www.openebs.io  https://github.com/openebs/openebs

https://min.io  https://github.com/minio/minio

 

3. You will need state management:

 

- can be as simple as using Container Native Storage (#2 above)

- or a DB (ideally a CNCF standards compliant DB)

- or a "Serverless" API

 

but #3 is a more complex subject for another day...

 

But also, for once, the complexity ends up yielding a very real simplicity, which is why Google, who invented Kubernetes, runs BILLIONS of containers every day!

 

https://kubernetes.io

https://azure.microsoft.com/en-ca/services/kubernetes-service/

https://en.wikipedia.org/wiki/Kubernetes

 

 

This topic is now closed to further replies.