Many finance teams in not-for-profits need to pull information from CRM, roster, payroll, and finance systems, and it is often slow and manual. Helpfully, there is now a lot of technology aimed at finance and data.
Here are some classes of tools finance teams are using to automate processes - freeing up time and shortening cycles - and to generate insight as partners to the business.
With the functional overlap between tools, it can be hard to know where to begin, or whether you need them at all. If you want a handy reference to what terms like RPA, Enterprise Mining, EPM, ETL, AI and BI mean from an FP&A perspective, let's get started:
BI - Business Intelligence - Visual dashboards that are typically a rear-view mirror for the organisation. The term was coined in the early 90s by Howard Dresner (before that they were sometimes called EIS - Executive Information System - which was a bit exclusive, or DSS - Decision Support System - which at least made it easy to intuit what you would use it for). BI often meant static reports and OLAP cubes (Online Analytical Processing just means used for analysis; cubes are like persistent pivot tables) until new visualisation tools in the late 2000s started making analysis more fluid and drill-down dashboards more interactive. Nowadays some BI tools use AI (described below) to provide recommendations on the data. Some EPM (described below) does BI, but in different ways, and sometimes EPM is run by Finance as the budgeting tool while BI is run by IT.
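To make the "persistent pivot table" analogy concrete, here is a minimal sketch in pandas: a tiny table of sales facts aggregated across two dimensions, much like a two-dimensional slice of an OLAP cube. The data is made up for illustration.

```python
import pandas as pd

# Hypothetical sales facts: one row per transaction
facts = pd.DataFrame({
    "region":  ["North", "North", "South", "South"],
    "product": ["A", "B", "A", "B"],
    "sales":   [100, 150, 80, 120],
})

# Aggregate the measure (sales) across two dimensions (region x product),
# the same shape of result an OLAP cube would keep pre-computed
cube = facts.pivot_table(index="region", columns="product",
                         values="sales", aggfunc="sum")
print(cube)
```

A real cube simply maintains many such aggregations across more dimensions and hierarchy levels, so analysts can slice and drill without recomputing.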
EPM - Enterprise Performance Management - a subset of BI that adds planning and modelling to the data - Integrated Business Planning, consolidation (sometimes consol is a standalone tool), driver-based modelling, statistical and predictive forecasting - often multidimensional, with an in-memory rules engine (the secret sauce of EPM - it means you can quickly model any calc you might do in a spreadsheet, and then some).
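As a sketch of what driver-based modelling means in practice, here is a toy model where revenue is derived from a few business drivers, the kind of calc an EPM rules engine evaluates in memory. The driver names and values are hypothetical.

```python
# Hypothetical drivers for a billable-services revenue model
drivers = {"headcount": 40, "billable_hours": 120, "utilisation": 0.75, "rate": 95.0}

def revenue(d):
    # Revenue driven by people, hours available, utilisation, and charge-out rate
    return d["headcount"] * d["billable_hours"] * d["utilisation"] * d["rate"]

base = revenue(drivers)

# What-if scenario: lift utilisation by five points and re-derive the output
scenario = revenue({**drivers, "utilisation": 0.80})
print(base, scenario)
```

The point of the pattern is that you change a driver, not the answer: every dependent calculation recomputes, which is what makes scenarios and what-ifs cheap.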
The flexibility to handle processes that involve people entering data and workflow approvals (not just analysing data) means that many spreadsheet-based processes - where people manually enter, email, and consolidate data - can be streamlined. That said, you can also streamline with RPA or a web capture tool - so what to use depends very much on the use case and how you ultimately use the data.
A lot of the time EPM is used for budgeting, but the name includes "Performance Management" for a reason. Done right, you can actually manage business performance - a full cycle from choosing the metrics to run your business, planning the future, and then feeding back actual performance to continually recalibrate, from strategic plans down to operational execution. One nice thing about EPM is that it can add a level of consistency over a disparate business - a "poor person's ERP" - because you can acquire a company and pull through its data for consolidated reporting and forecasting (or sales analysis, or workforce planning etc) in a fraction of the time it would take to align all the core systems.
Finally, you get Workforce Analytics, CPM Corporate Performance Management, FPM Financial Performance Management, SPM Sales Performance Management, BPM Business Performance Management and other flavours - but from a functional perspective they are much the same (not from a data and user perspective - some of them provide recommendations based on expert knowledge, eg a workforce engagement tool with specific recommendations to address issues, or industry benchmark comparisons).
DW - Data Warehouse, or Enterprise Data Warehouse - fashionable in the 90s, and cool again with a new suite of cloud databases that can scale up much more easily by separating compute from storage. Traditionally run by IT teams, a lot of the focus was on squeezing data into the warehouse (the warehouse analogy serves well) with less thought given to getting it out in a usable form.
Data Marts were just more focussed areas - often just the data structured in a different way (eg into a 'star schema', where the facts might be sales transaction data - like sell price and volume by date - while dimension data helps analyse this by, eg, product (from SKU all the way up the family tree), customer, time series, version, location etc - basically the same structure as OLAP cubes). Data Lakes (or lakehouses, where you add some AI for good measure for your data scientists and data engineers) and Swamps (not a good thing) are where all types of data (not just numbers, maybe unstructured) might be thrown first, before getting more structured and organised in a warehouse (ELT vs ETL - ie load it first and then work out how to organise it).
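The star schema described above can be sketched in a few lines of SQL: a fact table of sales transactions joined to a product dimension, so revenue can be rolled up the product family tree. Table names and data here are illustrative, using Python's built-in SQLite.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE fact_sales (sku TEXT, sale_date TEXT, volume INTEGER, price REAL);
CREATE TABLE dim_product (sku TEXT PRIMARY KEY, family TEXT);
INSERT INTO fact_sales VALUES
  ('S1', '2024-01-01', 10, 4.0),
  ('S2', '2024-01-01',  5, 9.0),
  ('S1', '2024-01-02',  7, 4.0);
INSERT INTO dim_product VALUES ('S1', 'Beverages'), ('S2', 'Snacks');
""")

# The classic star-schema query: join the fact to a dimension,
# then aggregate the measure up the hierarchy
rows = con.execute("""
    SELECT d.family, SUM(f.volume * f.price) AS revenue
    FROM fact_sales f
    JOIN dim_product d ON f.sku = d.sku
    GROUP BY d.family
    ORDER BY d.family
""").fetchall()
print(rows)
```

Real warehouses have many dimensions (customer, time, location, version) hanging off the same fact table - hence the "star" shape.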
Integration / ETL - Extract Transform Load - tools to join the dots, pulling data from one place to another and combining, sorting, and filtering it along the way - sometimes a standalone tool used to 'orchestrate' data-moving processes, sometimes embedded in any of the above tools, which often include 'connectors' for applications. Traditional ETL is often run by IT in conjunction with a DW, from which other tools like BI and EPM might then pull the data (except when the DW refreshes too slowly, eg overnight, and the finance team wants to use the EPM during month end - in which case the EPM probably also connects straight to the ERP or GL (Enterprise Resource Planning and General Ledger)). ETL can be used for anything from batch processes pulling in structured data (eg your trial balance or daily sales) to streaming data sources.
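A minimal sketch of the three ETL steps, assuming a simple GL extract: extract rows from a CSV (held in memory here), transform them (type-cast and drop zero lines), and load them into a SQLite table. The column names and figures are hypothetical.

```python
import csv
import io
import sqlite3

# Extract: in reality this would come from a connector or file drop
raw = "account,amount\n4000,1250.50\n4100,-300.25\n9999,0\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast amounts to numbers and filter out zero-value lines
cleaned = [(r["account"], float(r["amount"]))
           for r in rows if float(r["amount"]) != 0]

# Load: insert into the target table and check the result
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE gl (account TEXT, amount REAL)")
con.executemany("INSERT INTO gl VALUES (?, ?)", cleaned)
total = con.execute("SELECT SUM(amount) FROM gl").fetchone()[0]
print(total)
```

ELT simply reorders the middle and last steps: land the raw rows first, then transform inside the destination database.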
RPA - Robotic Process Automation - frees up a human from doing something mundane - eg login to a system, download a report, reconcile it, email someone if it doesn't balance. That said, you could use an ETL tool to query the same systems, pull the data into your EPM tool, and do the same check and email - so therein lies the skill in knowing which tool to use. RPA can streamline all or parts of a process like procure-to-pay, or be the last mile in sending out PDFs. Because many cloud APIs (application programming interfaces) have shifting standards, RPA can sometimes be a simpler way to grab data - but you would probably only use it where there isn't a native approach (eg querying the data directly from a database).
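The reconciliation step from the example above boils down to very simple logic, sketched here in plain Python. The system names and figures are hypothetical; a real RPA bot would fetch these balances from the applications and send the email itself.

```python
# Balances an RPA bot might have scraped or downloaded
bank_balance = 10_250.00    # eg from the banking portal
ledger_balance = 10_175.00  # eg exported from the GL

# The actual "reconciliation": compare and decide whether to escalate
difference = round(bank_balance - ledger_balance, 2)
needs_followup = difference != 0

message = (f"Out of balance by {difference}" if needs_followup
           else "Reconciled - no action needed")
print(message)
```

The value of RPA is not this check - it is driving the UIs of systems that offer no better way in, so the check can run unattended.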
RPA has been extended with process and task automation tools that "automate both legacy applications, such as terminal emulators, modern web and desktop applications, Excel files, and folders. Interact with the machine using application UI elements, images, or coordinates." In other words, the glue between applications to streamline messy stuff (similar to how folks used tools like VB and Access, but with far more integration across heterogeneous systems and data, more structure, and less code).
Enterprise Mining - not enterprise mapping, which was a lot of whiteboards and rumination; enterprise mining now does what Business Process Re-engineering (BPR) was trying to achieve. It quantifies the time spent in an operational process and helps pinpoint where to focus energy on streamlining it, with, say, RPA.
Financial Close - tools that automate accounting workflows and centralise period-end accounting activities. Sometimes they include reconciliation, sometimes reporting - so yes, overlaps with consolidation and EPM tools abound - but the focus is typically more operational financial accounting than FP&A.
AI/ML - Artificial Intelligence and Machine Learning (machine learning is a subset of AI). AI is good at some things, like classification, and not so good at causality - something statisticians have been grappling with for decades. Causality means understanding why something happened, through intervention or counterfactual dependence ("I am wet because it is raining"). While AI is sometimes hyperbolically used to describe traditional statistical techniques used in forecasting - eg taking a rolling average and projecting it forward - terms like "predictive" are fairly useful shorthand, regardless of technique. AI is moving from standalone applications to something woven into the architecture of any tool, the way electricity underpins a home appliance, micro-services underpin the internet, or a database underpins traditional software.
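The "traditional statistical technique" mentioned above - a rolling average projected forward - fits in a few lines. The figures are made up; the point is how modest the maths behind some "predictive" labels can be.

```python
# Hypothetical six months of actual sales
actuals = [100, 104, 98, 110, 107, 111]
window = 3

# Rolling average of the most recent periods
rolling_avg = sum(actuals[-window:]) / window

# Project the same level forward three periods (a flat, naive forecast)
forecast = [round(rolling_avg, 1)] * 3
print(forecast)
```

Genuine ML forecasting adds things like seasonality, external drivers, and confidence intervals - but vendors and buyers alike benefit from asking which technique is actually underneath the label.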
Analytics - tools that you use to do analysis - any combination of the above. It gets more complicated when analytics is described as: Descriptive (aka reporting - what happened), Diagnostic (aka analysis - why did it happen), Predictive (what will happen, what would I like to happen - planning, scenarios, what-if to provide options on what could be done), and Prescriptive (telling people what to do based on predictions - eg embedding it into an operational tool for decision support).
Spreadsheets - all of the above - spanning data capture, transformation, storage, compute and derived calculations, and visual presentation - albeit with a lot of manual intervention and error checking, and maybe email for distribution, workflow, and approvals.
There's a lot to learn to get sustained value from these investments, so it's best to start small, work with experts to get going, and focus on building capability, not just delivering an output.