
Expert CFD and Engineering Simulation Consulting

Fluid dynamics, CFD, science, HPC and industry views

Democratisation

The "democratisation of CFD" has been around for a while, and we've been simmering some thoughts on it for most of this time. With the recent article on the democratisation of HPC, we thought the time was right to share our views…

First, let's be clear about the terminology. We're taking democratisation to mean freedom of access: the increased openness and accessibility of CFD, HPC, or perhaps anything CAE related. The term gained currency with the provision of CFD capability inside CAD software (see Categories of CFD), and has grown with wider access to open source packages and the wealth of material supporting their use freely available on the web, and more recently with the likes of SimScale, who package it all into a GUI you can drive from your web browser and do full-on CFD on their public programme without spending a penny. In short, we think it's all about accessibility.

Now, the tricky part. Is it a good thing? Well, that very much depends on who you are. If you are someone interested in CFD and want to learn more, then the availability of relevant tools and software can only be a good thing. Having said that, if you were really interested and studied a relevant course, then you would have had good access and exposure in that environment, which may have carried over to working somewhere in the field with either a research or commercial focus. So probably not a really big deal for these folk. Our perception of the use of this term in the media is that the increased accessibility is a good thing for those who haven't studied or already focused in some way. There's an important maybe here.

Let's think about why you shouldn't take advantage of the accessibility unless you're starting with a base level of competence in the field. In a similar vein, over the past years there has been a "democratisation of poker": you can now play at a pub in arranged leagues, and also very easily online. If you just want to log on, learn a bit, set yourself a sensible budget and have some fun, then great. If you need to make some money but don't know your flush from your full house, then rocking up with a bunch of money isn't going to end well. OK, so the analogy is a little light-hearted, but the point holds: if you need some particular information from CFD but don't have a good base competence, then being able to "easily" run a CFD simulation is nothing but very risky.

HPC has undergone a similar journey, with access to serious compute power now readily available on a pay-per-use basis. On Engineering.com, they discuss the cost benefits of this approach in terms of commercial software use, with the move to on-demand cloud CPU hours offering decent cost reductions in their HPC strategy example. For us the real gain comes when access to HPC is coupled with open source tools, the latter taking licence costs out of the equation. Of course other costs pop in, like time from folk like us to help you get what you need from the open source toolset, but this will always be less (in our case) and provide a more flexible and responsive capability. OK, we're drifting off topic a little here. Getting back on course: for us the real value of the openness and availability of CFD and HPC isn't so much that it's more democratised, rather that whatever the complexity of your simulation needs, it's now more cost effective than it was before, easier to extend in complexity, and faster to assess more designs with, all with less hardware and logistics overhead.

So, if you want to take advantage of the lower cost base or to extend your simulation capability, don't go all in on 7-2 off suit -
get in touch and we'll help you realise the potential.

Fit for purpose CFD

Understandably, a lot of effort goes into making sure CFD gives accurate predictions, and of course there's a lot of scope for figuring out what accurate enough actually looks like. In this blog we'll discuss what we mean by fit for purpose CFD and how that cuts through a lot of work to get to answers that are useful in the shortest possible time.

For anything but the most straightforward of flows (in which case you may as well figure it out with pencil and paper), CFD takes some time and effort to provide useful output, due to the complexities of the geometry involved, turbulence, multiple phases and so on. The important question is how much of that complexity is important to you, and to know that you need to be really clear about what information you are trying to get from a simulation.

You might be faced with a flow-induced vibration problem at certain points in your piping network and want to simulate how different pipe geometry choices will affect the vibration frequency. The flow is unsteady, turbulent and multiphase, you face some tricky choices of which models to use, and you don't really have time for thorough validation - you need to show your model captures the existing vibration frequency straight out of the oven and can therefore predict geometry changes effectively. If the phases in the flow are well dispersed, it may be sensible to model the flow as single phase with a representative density, removing the multiphase physics and simplifying the choice and implementation of the turbulence model. The simulation can now deliver predictions in the timescales required. Now, the critical part here is to go back and analyse the simulation data to see if the single-phase assumption holds - are there any characteristics of the flow that suggest it will no longer be well dispersed in the critical areas? Of course, if you don't match the measured vibration frequency then you know you've missed something, but if you do, you need to be sure it's a well-founded agreement, and that when you simulate the modified pipework you go back and check your data again for the validity of the assumptions.

For us, this is the essence of fit for purpose CFD - making helpful assumptions and constantly checking the data to see if they hold, and if they don't, working out how much they affect the information being used from the model - asking and understanding "what physics are important here?". In order to answer this it's critical to know whether the CFD is expected to give accurate quantitative information or qualitative shifts. In the vibration example above, the requirement might be to increase the vibration frequency above the structural resonance frequency, in which case a qualitative shift would suffice. It might be to increase the frequency away from the first structural resonance but not enough to run into the next mode, in which case the results must be sufficiently quantitatively accurate.

In the product development cycle, simulation delivers the best value early on, in the concept development and assessment phases where varied physical prototype testing is tricky and costly. A reduced set of physical tests linked to CFD through specific validation points allows the simulation to be well validated and deliver important performance characteristics while design philosophies are evolving. The timescales here are more favourable for detailed validation, and the importance of real data for this purpose is often well appreciated given the nature of the decisions being taken early in the design cycle. Nonetheless, it's unlikely that the most physically complete simulation will be run for all design considerations, especially for complex products where the full set of physics would require long simulation times and extensive hardware. These days the cloud provides the hardware, but product development timescales are always pressured, so being able to run sub-models for specific aspects makes a lot of sense. Here our approach of constantly assessing the physics in light of the required information works well, with regular referral to and modification of the experimental programme.

CFD is often required to solve problems later on in the design cycle, when architecture decisions have been made and there are some performance issues to resolve. This is challenging for simulation, as the problems are often not well addressed by experiment, hence the requirement for simulation to provide additional information not available from test. The opportunities for validation here are limited, as it's the extra information from CFD that is required. Timescales are often tight, as it's a problem-solving scenario and fixes are required fast. The broad and flexible validation link to experimental programmes that's well placed in concept design and development often doesn't fit here. In this case it's critical to understand exactly what information is required, use that to assess what physics are likely to be important, and tailor the CFD to provide answers in as short a time as possible. It's in these sorts of cases that our experience in physics selection and continuous analysis adds real value, enabling CFD to answer difficult questions in challenging timescales.

If you have requirements for effective simulation applied at any stage of your product development process
get in touch and we'll use our experience of fit for purpose CFD to help.

Categories of CFD Software

There's a brief article on Engineering.com highlighting the Pros and Cons of 5 CFD Software Categories that makes interesting, if a little light, reading. The 5 categories run from Open Source, through Open Source plus Wrapper, CAD-Integrated and Specialised, to Complete. There's quite a lot going on here, and it's worth digging a little deeper (and also worth having a look at that article first)….

The open source category, our particular favourite, is free at the point of use, assuming you have some hardware (or maybe just access to a web browser, which is all you need to use
SimScale, more on that later). It's not really free at the point of delivering some useful analysis, as you need to invest time to learn it, or hire someone who knows how it works - assuming, of course, that you have chosen which software to use, which, given the capability and complexity of some of the toolsets out there, is not straightforward. Nevertheless, the lack of licence costs is a clear advantage, although these can be offset by the cost of finding folk skilled in the art. For us, one of the real value-adds of open source is access to the source code and the availability of people skilled in its use and development (like us). This suits niche applications well, where you can have specific development performed for your particular application much faster and cheaper than through the other categories, which would almost always involve the software company doing the development and prioritising it against their own roadmap - for SMEs or bespoke requirements, it's going to have to work hard to make it to the top of their list. The examples given are finite volume CFD (OpenFOAM and SU2) and Lattice Boltzmann (LBM) based (Palabos). There are a lot of open source tools out there, so you can't mention them all, but the FEA open source equivalent to Comsol, Elmer, definitely deserves a mention. Elmer could be labelled a "multi-physics" solver, as it can do fluids, solids, electromagnetics and a bunch of other stuff too, so maybe it was omitted for those reasons (although technically OpenFOAM is "multi-physics" too).

We're not too sure about the interpretation of "wrapped" open source in the article, certainly in terms of suggesting support is poor. Taking the widely known and used finite volume differential equation solving toolset OpenFOAM as an example (it's
not just CFD), there are some GUI front ends, such as HelyxOS from Engys, that are very much do-it-yourself, but the commercial Helyx offering and Caedium from Symscape are provided by people who absolutely know their onions, and are able and willing to help you with yours. There's a subtlety here too: there are a lot of unwrapped (i.e. no GUI) developed flavours, such as that from Caelus and the application-specific versions from CFDSupport. Maybe OpenFOAM is a bit of a special case here, as it's had good work from a number of areas for some time. It's interesting, though, that the examples provided are Caedium (already mentioned), SimScale, and Visual-CFD (from ESI). A particular mention is deserved here for SimScale, which combines OpenFOAM, SU2 (a CFD-specific, compressible flow and adjoint optimisation focussed offering) and Calculix (a solid mechanics FEA solver with good contact modelling capability). All of these are open source, and SimScale "wraps" them up into a web browser. The SimScale guys also know their stuff, so maybe these examples are a bit off point.

CAD-Integrated CFD gets a fair description in terms of pros and cons, and is very much on point with regards to it not being the route of choice for analysts. There's another blog's worth of material on the topic of whether you need a designer, engineer or analyst to do your CFD for you - the subject of "Democratisation" of CFD - we'll get on to this in another blog.

The Specialised CFD category is a tricky one, which tries to identify a subset of what we would describe as commercial analyst-level CFD. If there is a line, it's certainly blurred; the Complete CFD category is described as being standard in the aerospace and auto industries, but Exa
PowerFlow is heavily used by Jaguar Land Rover in the UK, and that software is in the Specialised CFD grouping. Volkswagen and Audi are heavy users of OpenFOAM, too. Commercial Analyst-Level CFD certainly feels a better catch-all description - granted, within that you have tools that only offer specific functionality rather than the "everything you could need" level from the well-known big players, but that is the only real disadvantage for the specific-functionality providers. This category is worthy of some further detail, but any sub-categorisation doesn't really add much to the open source versus commercial benefits and limitations conversation, so we'll leave it there for now.

A finer point worth dropping in at the end is the use of "CFD Software" as a description, with some later mention of pre- and post-processing capability. The landscape across open source and commercial looks a little different when you start to think about pre- and post-processing tools - we'll be discussing these two other parts of the CFD process with regard to open and commercial tools in another blog, coming soon….

Initial OpenFOAM Parallel Performance on Pi Cluster


There have been a few queries on the web about parallel scaling of OpenFOAM on Raspberry Pi's after a good write-up on the Pointwise blog following our first tweet on getting it installed. The graph below shows the scaling for up to four Pi's on our 3D driven cavity test case. You can see linear scaling only up to 2 Pi's. Important things to remember here are: OpenFOAM matrix solvers have different parallel performance, and the locations of partition boundaries also affect inter-process communications. We're planning a little project to look at these and other effects on parallel scaling in OpenFOAM. A small cluster of Pi's is a great (cheap!) way to do this...

[Graph: parallel speed-up of the 3D driven cavity case on up to four Raspberry Pi's]
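For anyone wanting to reproduce this kind of test, a standard OpenFOAM parallel run looks something like the sketch below. The case path and solver are assumptions for illustration (a 3D icoFoam driven cavity); it assumes OpenFOAM is installed on each Pi, the case sits at the same path on each, and a "machines" hostfile lists the Pi hostnames with their slots. The number and method of mesh partitions are set in the case's system/decomposeParDict:

```shell
cd ~/run/cavity3D                 # hypothetical case directory
decomposePar                      # partition the mesh per system/decomposeParDict
mpirun --hostfile machines -np 4 icoFoam -parallel > log.icoFoam 2>&1
reconstructPar                    # merge the processor* directories back together
```

The choice of decomposition method in decomposeParDict moves the partition boundaries, which is exactly the inter-process communication effect mentioned above.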

Raspberry Pi OpenFOAM Installation

As you can see, we've updated our website. We hope you like the new look. Below are the details of getting OpenFOAM up and running on Raspberry Pi's. It's quick, fun and educational, so we thought we should keep it on the site.

Hardware

The hardware is as follows: 2x Raspberry Pi, 2x mains power adapters, 1x 8GB SD card borrowed from a DSLR camera, 1x 16GB SD card purchased with "wheezy" Raspbian OS pre-installed, 1x Netgear FS308 10/100Mb/s network switch with mains adapter, and a couple of cat5 cables. We also got a couple of transparent perspex cases for the Pi's - these are only for aesthetic purposes, but it is a good idea to get some type of case to give your Pi's some protection.

OS Install

For the 8GB SD card we downloaded the "wheezy" Raspbian OS from
http://www.raspberrypi.org/downloads and installed it using the simple command line process below, from http://elinux.org/RPi_Easy_SD_Card_Setup

>diskutil list
identify the disk (not partition) of your SD card. e.g. disk4 (not disk4s1)

>diskutil unmountDisk /dev/<disk#>
e.g. diskutil unmountDisk /dev/disk4

>sudo dd bs=1m if=<image>.img of=/dev/<disk#>
e.g. sudo dd bs=1m if=2012-12-16-wheezy-raspbian.img of=/dev/disk4

This did take a while, due to buffering of the image write. On a Mac it's faster if you use

>sudo dd bs=1m if=<image>.img of=/dev/<rdisk# from diskutil>
e.g. sudo dd bs=1m if=2012-12-16-wheezy-raspbian.img of=/dev/rdisk4

There’s a bunch of other GUI ways of doing the same thing.

The 16GB card came with "wheezy" pre-installed, so none of the above was necessary.

Start-Up & Config

Both Pi's were started connected to a monitor over HDMI. The raspi-config screen greets you on first boot. We changed the hostnames of the Pi's to pi1 and pi2 and changed the passwords to the same simple text. We set the hostnames and specific IP addresses for the Pi's under the DHCP settings of the router for quick and easy remote access. We also expanded the filesystem to occupy all of the SD card (it's 2GB by default). At the LXTerminal command line we ran

>sudo apt-get update
>sudo apt-get upgrade
>sudo apt-get dist-upgrade

to make sure the packages and wheezy distribution are up to date.

We then downloaded OpenFOAM compiled specifically for the Pi from http://rheologic.at/sites/default/files/downloads/RheologicRemix-2.2.1-Raspbian.tgz. A quick mention for the guys at Rheologic is deserved here, as the Pi compilation is very stable (no issues with OpenFOAM or openMPI during any of the tests here) and they are very helpful and friendly.

We installed OpenFOAM in the /opt directory and updated the OpenFOAM bashrc to point there (it's one of the standard options). We then added

source /opt/OpenFOAM/OpenFOAM-2.2.1/etc/bashrc

to the top of our /home/pi/.bashrc file. It needs to be at the top, before the guard that stops non-interactive shells reading any further, because when openMPI launches processes on the parallel Pi's the shell is not interactive, and we still need the user's .bashrc to have sourced the OpenFOAM bashrc.
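In other words, the top of /home/pi/.bashrc ends up looking something like the fragment below. The interactive-shell guard shown is the stock one shipped in Raspbian's default .bashrc; the ordering is the important part:

```shell
# /home/pi/.bashrc (top of file)
# Source the OpenFOAM environment first, so that the non-interactive
# shells spawned by openMPI on the remote Pi's still pick it up.
source /opt/OpenFOAM/OpenFOAM-2.2.1/etc/bashrc

# Stock guard: if not running interactively, don't read any further.
case $- in
    *i*) ;;
      *) return ;;
esac
```

Put the source line below that guard and mpirun across the Pi's will fail to find the OpenFOAM commands.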

Before running the OpenFOAM tests, we changed the Pi’s to boot to command line only with no GUI using

>sudo raspi-config

We didn't test it explicitly, but initial OpenFOAM runs looked to be significantly slower when the Pi's were running the GUI OS. All the tests below were performed by ssh'ing into the Pi's from a Mac with XQuartz running as an X11 server.