
Code Level Runtime Analytics


Question

Hi,

 

I've been given a task to analyse and collect information on all code paths for a very large monolithic .NET web app that is used across various clients.

 

I'm thinking about adding some reflection-type code that collects data on the run-time code paths taken, depending on the parameters passed to various methods.
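
For illustration only, the kind of hand-rolled recording I have in mind is roughly this (a rough sketch; it uses caller-info attributes rather than full reflection, and the in-memory dictionary just stands in for whatever sink we end up writing to):

using System.Collections.Concurrent;
using System.Runtime.CompilerServices;

// Rough sketch of a hand-rolled run-time recorder (names are placeholders).
// Each instrumented method calls Record() on entry; counts live in memory
// and would be flushed to a real sink (file, DB, APM service) later.
public static class CodePathRecorder
{
    private static readonly ConcurrentDictionary<string, long> Hits =
        new ConcurrentDictionary<string, long>();

    public static void Record(
        [CallerMemberName] string member = "",
        [CallerFilePath] string file = "")
    {
        Hits.AddOrUpdate($"{file}::{member}", 1, (_, count) => count + 1);
    }
}

// Inside an existing method it would just be:
//     CodePathRecorder.Record();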

 

Once I have the data, I need to start removing/deprecating code that never gets used over a few months in the real world, along with refactoring and adding more test coverage on the major code paths to improve quality.

 

I am aware of the unit testing approach, but this app is 13+ years old: unit tests can't just be added, the code can't easily be refactored without breaking something, and it can't easily be re-written.

 

Are there any tools or frameworks that can be integrated into the existing code that would allow me to collect such data?

 

I'm not looking for code samples, but if anyone has them, awesome. Any help, general guidance, or software/framework recommendations would be highly appreciated.

 

Something like Google Analytics but for the actual source code itself.

 

TA :)

8 answers to this question


Plain old code profiling/analysis has been a popular tool category since the dawn of .NET, but to some extent Roslyn has sparked a modern revolution in .NET tooling.

 

1. Unit testing has NOTHING to do with any aspect of this

 

2. nanoRant: Constant Continuous Code Refactoring was the real useful business "take-away" from Extreme Programming, not unit testing, which is mostly a sick joke in the currently common watered-down, weak descendant of Extreme Programming techniques.

 

3. You can use AOP to instrument any large body of existing code (see the sketch at the end of this post): https://www.postsharp.net/aop.net

 

4. .NET has the most advanced compiler on Planet Earth in the form of Roslyn, so any tool that uses the code understanding features of Roslyn should be given a preference.

 

https://github.com/dotnet/roslyn

 

5. Here are a few starting points for you:

 

https://github.com/mre/awesome-static-analysis

 

https://en.wikipedia.org/wiki/List_of_tools_for_static_code_analysis

 

https://www.owasp.org/index.php/Source_Code_Analysis_Tools

 

https://visualstudiomagazine.com/articles/2018/05/01/vs-analysis-tools.aspx

 

https://www.sonarsource.com/products/codeanalyzers/sonarcsharp.html

https://github.com/SonarSource/sonar-dotnet

 

https://www.ndepend.com
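
To make the AOP point concrete, here is a minimal sketch of a method-boundary aspect, assuming PostSharp's OnMethodBoundaryAspect from the PostSharp NuGet package; the Console sink and the "MyApp" namespace filter are placeholders for your real logging pipeline and root namespace:

using System;
using PostSharp.Aspects;
using PostSharp.Extensibility;

// Minimal sketch: logs every entry into an instrumented method.
// Assumes the PostSharp NuGet package is installed and licensed.
[Serializable]
public class CodePathAspect : OnMethodBoundaryAspect
{
    public override void OnEntry(MethodExecutionArgs args)
    {
        // Record which method was entered and how many arguments it received.
        Console.WriteLine(
            $"{DateTime.UtcNow:o} {args.Method.DeclaringType?.FullName}.{args.Method.Name} " +
            $"({args.Arguments.Count} argument(s))");
    }
}

// Applied once at assembly level, the aspect is multicast onto every public method
// in the chosen namespace without touching the existing code:
// [assembly: CodePathAspect(AttributeTargetTypes = "MyApp.*",
//                           AttributeTargetMemberAttributes = MulticastAttributes.Public)]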

 

 


Thank you. You are spot on about many things, but the mess I have is inherited, so there is very little I can do about unit tests, continuous code refactoring, etc. The initiatives are also inherited, so I have no choice but to investigate the possibilities.

 

Let me tell you the most important outcome I am after. There are a lot of QA issues, and they only appear in UAT or, even worse, post-deployment :( What I am trying to achieve is to find the most-hit areas of the code under different client configurations, and then get our QA teams to hit those areas as hard as they can to sort out the QA issues. In the long run the whole thing is going to be re-written, but that is a few years away, and I need results in 3-4 months.

 

So, that said, I am actually looking for run-time recording and analysis of method calls and parameter values, so I can generate a heat map for each client configuration and then work with the QA team. Hopefully this gives a little more context.
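
To be concrete about what I mean by a heat map, something along these lines would be enough (a rough sketch; ClientConfigId and the CSV dump are placeholders for our real configuration and storage):

using System;
using System.Collections.Concurrent;
using System.IO;
using System.Linq;

// Rough sketch of the per-client heat map: every instrumented method calls Hit(),
// counts are kept per (client configuration, method) pair, and Dump() writes them
// out hottest-first so the QA team knows where to aim.
public static class HeatMap
{
    // Placeholder: in the real app this would come from the client's configuration.
    public static string ClientConfigId = "default";

    private static readonly ConcurrentDictionary<string, long> Counts =
        new ConcurrentDictionary<string, long>();

    public static void Hit(string typeName, string methodName)
    {
        Counts.AddOrUpdate($"{ClientConfigId},{typeName}.{methodName}", 1, (_, n) => n + 1);
    }

    public static void Dump(string csvPath)
    {
        var lines = Counts.OrderByDescending(kv => kv.Value)
                          .Select(kv => $"{kv.Key},{kv.Value}");
        File.WriteAllLines(csvPath, lines);
    }
}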


Oh, that's simple then, just plain old Visual Studio 2019 stuff...

 

Dynamic tracing of executing code is a standard, well-tested, and very useful feature of Visual Studio 2019 (IntelliTrace):

 

https://azuredevopslabs.com/labs/devopsserver/intellitrace/

 

Also useful to your area:

 

https://azuredevopslabs.com/labs/devopsserver/codeanalysis/

 

https://azuredevopslabs.com/labs/devopsserver/intellitest/

 

https://azuredevopslabs.com/labs/devopsserver/liveunittesting/

 

https://azuredevopslabs.com/labs/devopsserver/livedependencyvalidation/

 

https://azuredevopslabs.com/labs/devopsserver/releasemanagement/

 

 

 

 


As a general observation, you seem to be describing a very large-scale retrofit of running code with instrumentation payloads, plus a re-architecture that is best introduced in a green-field scenario.

 

Your system then becomes high-risk for Heisenbugs, which can lead to a nightmare of "ring around the rosies":

 

https://en.wikipedia.org/wiki/Heisenbug

 

  On 08/05/2019 at 07:16, DevTech said:

Oh, that's simple then, just plain old Visual Studio 2019 stuff...

 


We are still on VS2013. We have secured an enterprise agreement to get VS2019, and a few selected people, including myself, have it, but wide-scale deployment is a few months away.

 

Speaking to a senior architect, I learned we have used New Relic before with amazing success, but this being a financial system, our company's stance on not using cloud tech (just yet) ruled out the use of New Relic. We are in the process of getting that ban reviewed.

  On 08/05/2019 at 07:29, DevTech said:

As a general observation, you seem to be describing a very large scale retrofit of running code with instrumentation payloads and re-architecture best introduced in a green field scenario.

 


Another reason I am collecting the data is to identify the heavy hitters and address QA issues with extensive test coverage (not unit tests, but end-to-end, scenario-based tests) using our automated regression system.

 

Thank you for your help and guidance on this. Much appreciate it.


Well, obviously my thoughts have been very generic, but they have the small, possibly useful, attribute of being an "outside viewpoint".

 

Each one of your replies drops an extra hint of a legacy system of very large size and complexity, for which you deserve extra credit in seeking out all viewpoints and ideas in your evaluation process.

 

So in that spirit, and in keeping with the excellent standard of due diligence you are exhibiting, I will point out a HUGE sea change in the design, architecture, deployment, and real-time delivery of modern enterprise (and anything large) applications to users, and that is the Kubernetes revolution. At this point EVERY enterprise player has signed on to this architecture; it has arrived and will be considered mandatory dial-tone infrastructure within a few years, if not right now.

 

I point this out in your case because any establishment using wonderful .NET technology may have missed some of the signals and messaging around this architecture, since at first glance it seems to be about things a bit distant from .NET platforms: "Cloud" and Linux. Even if it is not possible to shoehorn a legacy system into the new way of doing things, there may be opportunities to build in compatibilities as you go along...

 

The standards around this architecture are run by the CNCF (Cloud Native Computing Foundation, part of the Linux Foundation), and it can easily be missed that it describes the future of enterprise computing BOTH for cloud and on-premise, and ALSO both for Linux and for Windows. Microsoft is a PRIMARY member of this foundation. There is no restriction on following a CNCF standard on local servers and with Windows technology. In fact, some of the tech is already baked right into the Windows API.

 

Skipping all the crap in between, the beautiful result of twisting application architecture into many Docker containers managed by Kubernetes is that the application becomes robust, scalable, hot-deployable and, most importantly for the enterprise, self-healing with zero downtime. Kubernetes manages the life cycle and moves containers around as needed by resource requirements, best fit, and demand loading. All the infrastructure is free OSS, runs on local servers and dev machines (well, beefy ones...), and once working, scales with little or no effort to larger local clusters or the cloud, since it is a standard supported by every cloud provider.

 

The downside is a bit of head-scratching to understand where to store state when the application containers are stateless (the only way to get self-healing) and how to talk to your application when Kubernetes might have moved it anywhere!

 

Windows 10 and the latest Windows Server have native support built into the Windows API for both native Windows containers and Linux containers. The latest version of .NET Core thrives in this flexible, cross-platform, ubiquitous environment.

 

https://www.cncf.io

https://www.cncf.io/about/members/

https://landscape.cncf.io

https://www.docker.com/products/windows-containers

 

Windows Containers on Windows 10

https://docs.microsoft.com/en-us/virtualization/windowscontainers/quick-start/quick-start-windows-10

 

Linux Containers on Windows 10

https://docs.microsoft.com/en-us/virtualization/windowscontainers/quick-start/quick-start-windows-10-linux

 

 

[Attached image: CNCF Trail Map]


Hey I'm sure that will work out well for you. Best wishes there...

 

If you want, as you go along, feel free to throw out thoughts, inquiries, etc. - micro or macro...

 

A Note for other readers:

 

Something like a snapshot of a system or a database is very primitive compared to container self-healing, which is a kind of quantum-leap first step towards a Holy Grail of computing. It works. It has not beaten down the doors of anyone's attention, since it can be seen as "limited" in that it needs major changes to the architecture of things. It is a remarkable by-product of Docker containers being stateless, where an entire application image becomes the (huge) equivalent of a stateless HTTP request.

 

Normal stuff you expect in your VM server world that is missing in zillions of amorphous clusters of Docker containers:

 

1. You need a Service Mesh to locate and talk to your App:

(your App moves around, changes IP address, adds copies of itself on demand load, etc)

 

Examples of CNCF Solutions:

https://linkerd.io  https://github.com/linkerd/linkerd2

https://www.getambassador.io  https://github.com/datawire/ambassador

https://www.envoyproxy.io   https://github.com/envoyproxy/envoy

https://traefik.io  https://github.com/containous/traefik

 

2. You most likely will need a file system composed of specialized Containers:

(your app can be destroyed, moved, etc., so, like an HTTP request, nothing is retained locally)

 

Examples of CNCF Solutions:

https://rook.io   https://github.com/rook/rook

https://www.openebs.io  https://github.com/openebs/openebs

https://min.io  https://github.com/minio/minio

 

3. You will need State management

 

- can be as simple as using Container Native Storage (#2 above)

- or a DB (ideally a CNCF standards compliant DB)

- or a "Serverless" API

 

but #3 is a more complex subject for another day...

 

But also, for once, the complexity ends up yielding a very real simplicity, which is why Google, who invented Kubernetes, runs BILLIONS of containers every day!

 

https://kubernetes.io

https://azure.microsoft.com/en-ca/services/kubernetes-service/

https://en.wikipedia.org/wiki/Kubernetes

 

 
