SimScale CAE Forum

Are we mining crypto?


#1

Possibly nobody has noticed this: as a software engineer I have a bias toward computing efficiency and resource availability. Here is what I first noticed:

  1. Start Task Manager in Windows > ‘Performance’ tab.
  2. Start a SimScale meshing or simulation process in your browser.
  3. Observe CPU usage while moving the cursor back to the browser tab where SimScale is running meshing (then move to a different tab and back to the SimScale tab). CPU usage goes up when you are on the SimScale tab while meshing runs.
  4. Go to Task Manager > ‘Processes’ tab. Click the CPU column header to sort by CPU time (active processes on top). Move back to the SimScale browser tab and observe the Task Manager process list.

Bottom line: when you focus (move the cursor to) the SimScale simulation (meshing) window in your browser, CPU usage goes up by at least 5%, despite the fact that no rendering is active (your hands are on the desk).

You will find that the meshing log announces it is done, but core hours keep incrementing. CPU is consumed (if focus is on that window) until roughly twice the time at which the first meshing part (‘mesh ended’) finished. Example: the meshing log said ‘done’ at 5 core hours, but SimScale will tell you that meshing finished at 10 core hours. CPU activity starts immediately after the notice from SimScale that meshing is ready.

In reality it is more complex. I use Chrome and have an Nvidia graphics card. Chrome communicates with my Nvidia card through a separate process. When running meshing with focus on the SimScale simulation window, the Nvidia card is also used, at about 5-6% of capacity. I used different tools to look into that bizarre behaviour and to convince myself of the ‘impossible’, but I presented here only the basic checks that everybody can reproduce.

Hypotheses:

  1. Sloppy programming: forgotten threads, easter eggs, whatever.
  2. OpenFOAM programs ‘rent’ the clients’ computers when they are kind enough to keep focus on the simulation window.
  3. Crypto mining runs in the background, gently picking up resources while SimScale is in meshing or simulation mode.

Here is a screenshot of Process Explorer while I’m in a simulation. You can see two Chrome processes at nearly 5%.


#2

Hi @Retsam,
interesting investigation. Before this turns into any sort of conspiracy theory: I can assure you that SimScale does not use any of your local computing resources for anything besides the base visualization of the workbench and the rendering needed to display your geometries and post-processing results.
All computations for meshing and solving are done entirely on cloud computing instances, not on any of your local processing units.
The post-processor also uses your local graphics card for enhanced rendering.
You can confirm this yourself by starting a mesh or simulation run and then closing your browser: you will see that the runs take the same amount of time to finish as they would with your browser open.
And of course we are not mining any crypto with your local computing resources! Or any resources. At all.

The “5% increase in CPU usage” is probably due to a process that checks for new results, which is triggered periodically while the user is active on the workbench. Switching to the SimScale tab signals the platform that you are active, so it checks for updates. You can verify this by inspecting the network calls.
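A minimal sketch of what such visibility-gated polling could look like (hypothetical code, not SimScale's actual implementation; the class, method names, and polling interval are made up for illustration):

```python
# Hypothetical sketch: the workbench polls the server for job status only
# while the tab is visible, so CPU activity appears exactly when you focus
# the SimScale tab. All names here are illustrative, not SimScale's code.

class ResultPoller:
    def __init__(self):
        self.visible = False      # does the SimScale tab currently have focus?
        self.requests_made = 0    # status requests actually issued

    def on_visibility_change(self, visible):
        # In a browser this would be driven by the visibilitychange event.
        self.visible = visible

    def tick(self):
        # Called once per polling interval (e.g. every few seconds).
        if self.visible:
            self.requests_made += 1  # a real client would fetch job status here

poller = ResultPoller()
for step in range(10):
    poller.on_visibility_change(step >= 5)  # tab gains focus halfway through
    poller.tick()

print(poller.requests_made)  # → 5: only the ticks spent in focus polled
```

This matches the reported symptom: the extra CPU work starts and stops with tab focus, with no computation happening locally.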

I hope that clarifies your doubts; if not, I can probably add some more technical details on Monday.

Best,
Richard


#3

Hi Richard,

I really do not believe SimScale is doing cloud computing on clients’ PCs (though it would be possible with client consent, like the SETI “alien search” in the past). I checked network traffic, of course, and the period of CPU and GPU activity (while meshing is running) is not correlated with any suspect network transfer. By the way, even Google is experimenting with new protocols, like QUIC, but that still runs over UDP transport, so it is caught by a standard network analyzer. There are other transfer protocols that may go deeper and stay undetected. Hmmm.

But that combination of a silent network with a busy CPU and GPU resembles the scheme of crypto mining on smartphones, since there is no need to be connected to the network in order to mine a chunk.

Now, 5% of a CPU (and possibly 5% of a GPU, when you have one) times 100,000 clients gives 5,000 CPUs burning resources for nothing, as long as clients keep the cursor on the simulation window. If it is a program defect, I suggest finding and fixing it.
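The back-of-envelope arithmetic above, as a quick check (the 100,000-client figure is the post's own assumption, not a real user count):

```python
clients = 100_000        # assumed number of concurrent clients (post's figure)
cpu_per_client = 0.05    # ~5% of one CPU observed per focused client
wasted_cpu_equivalents = clients * cpu_per_client
print(wasted_cpu_equivalents)  # → 5000.0 full CPUs' worth of idle burn
```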

I do see my GPU active when rendering results, but that is a GPU job and takes a couple of seconds or less…

Cheers,

Retsam