2024-12 Security & Audit Check

Hi all! Welcome to the end-of-year goody that we traditionally hand out to guarantee you have something to celebrate at the end-of-year party! This time, I wish to introduce to you a full vulnerability check of your Db2 for z/OS systems.

DORA!

You should all be aware, and scared, of DORA by now. If not, read my prior newsletter 2024-11 DORA, or check out my IDUG presentation or my recorded webinar. Whatever you do, you must get up to speed with DORA, as it comes into force on the 17th of January 2025 – only one month away from the publishing date of this newsletter!

Not just Born in the USA!

Remember, DORA is valid for the whole wide world, not just businesses within the EU. If you do *any* sort of financial trading within the EU, you are under the remit of DORA, just like you are with GDPR! Even if you are not trading within the EU bloc, doing a full vulnerability check is still a pretty good idea!

PCI DSS V4.0.1

We also now have the Payment Card Industry Data Security Standard (PCI DSS) V4.0.1 going live at the end of March 2025… Coincidence? I don’t think so. Mind you, at least the Americans do not fine anyone who fails!

What are We Offering?

This year, the product is called the SecurityAudit HealthCheck for Db2 z/OS, or SAC2 for short. It is a very lightweight and robust tool that basically performs all of the vulnerability checks published by the Center for Internet Security (CIS) in their benchmark document for Db2 13 on z/OS:

CIS IBM Z System Benchmarks (cisecurity.org)

https://www.cisecurity.org/benchmark/ibm_z

This contains everything you should do for audit and vulnerability checking and is well worth a read!

Step-By-Step

First Things First!

The first thing SAC2 does is evaluate *all* security-relevant ZPARMs and report which ones are not set to a "good" value. It then goes on to check that well-known default values have not been left in place – the TCP/IP port number, for example. Then it finishes off by validating that SSL has been switched on for TCP/IP communications and that any and all TCP/IP ALIAS definitions also have the correct settings.
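If you fancy a quick manual peek at your ZPARMs in the meantime, here is a minimal sketch, assuming the IBM-supplied stored procedure SYSPROC.ADMIN_INFO_SYSPARM is installed at your site (parameter list as I remember it – check the syntax at your Db2 level):

-- NULL = the connected subsystem; the two ?-markers are the
-- return-code and message output parameters
CALL SYSPROC.ADMIN_INFO_SYSPARM(NULL, ?, ?);
-- The result set lists every subsystem parameter. Filter it for the
-- security-relevant ones (SECPORT, TCPALVER etc.) and compare them
-- against your "good" values.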

Communication is Important!

Next up is a full evaluation of your Communication Data Base (CDB). This data has been around for decades and started life for SNA and VTAM connections between Host Db2s. These days, SNA is dead and most of the connections are coming from PCs or servers. That means that there *could* be a lot of dead data in the CDB and, even worse, ways of connecting to your mainframe that you did not even know, or forgot, existed! Think plain text password with SQLID translation, for example!
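If you would like a first look at what is lurking in your own CDB, here are a couple of hedged starter queries (column meanings are in the Db2 catalog documentation – and please interpret the results before deleting anything, see the next section!):

-- Outbound security per connection: a SECURITY_OUT of 'P' means a
-- password flows out - is that connection also encrypted?
SELECT LINKNAME, SECURITY_OUT, USERNAMES
  FROM SYSIBM.IPNAMES;

-- Inbound/outbound authid translation rules - any surprises in here?
SELECT TYPE, AUTHID, LINKNAME, NEWAUTHID
  FROM SYSIBM.USERNAMES;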

Danger in the Details

Naturally, blindly changing CDB rows is asking for trouble, and if SAC2 finds anything odd/old/suspicious here, you must create a project to start removal. There is a strong correlation between "Oh, I can delete that row!" and "Why can't any of my servers talk to the mainframe anymore?". The tool points out certain restrictions and prerequisites that must be dealt with *before* you hit the big button on the wall – the JDBC driver version, for example.

Taking it All for GRANTed?

GRANTs can be the root of all evil! GRANTs TO PUBLIC just make auditors cry, and use of WITH GRANT OPTION makes them jump up and down. Even IBM is now aware that blanket GRANTing can be dangerous for your health! SAC2 analyzes *all* GRANTs: PUBLIC grants on the Catalog and Directory are flagged, as these should NEVER be done (with the one tiny exception of, perhaps, on a good day when the sun is shining, SYSIBM.SYSDUMMY1); PUBLIC grants on user data are flagged, as those are just lazy; and everything is checked for WITH GRANT OPTION, to make sure you are working to modern security standards!
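A hedged sketch of the sort of query involved – note that this only covers table privileges; a complete check must walk all of the *AUTH catalog tables:

-- PUBLIC grants, or table privileges held WITH GRANT OPTION ('G'):
SELECT GRANTEE, TCREATOR, TTNAME,
       SELECTAUTH, INSERTAUTH, UPDATEAUTH, DELETEAUTH
  FROM SYSIBM.SYSTABAUTH
 WHERE GRANTEE = 'PUBLIC'
    OR 'G' IN (SELECTAUTH, INSERTAUTH, UPDATEAUTH, DELETEAUTH);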

Fun Stuff!

These days, you should be using Trusted Contexts for access from outside the Host. These then require Roles, and all of this needs tamper-proof Audit Policies. On top of all this come the extra bits and pieces of Row Permissions and Column Masks. All of these must be validated for the auditors!
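A hedged inventory sketch, straight from the catalog, of the objects your auditors will ask about:

SELECT NAME FROM SYSIBM.SYSCONTEXT;                  -- trusted contexts
SELECT NAME FROM SYSIBM.SYSROLES;                    -- roles
SELECT AUDITPOLICYNAME FROM SYSIBM.SYSAUDITPOLICIES; -- audit policies
SELECT CONTROL_TYPE, SCHEMA, NAME                    -- 'R' = row permission,
  FROM SYSIBM.SYSCONTROLS;                           -- 'C' = column mask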

Elevated Users?

Then it lists out the group of privileged User IDs. These all have elevated rights and must all be checked in detail: who has what, and why?
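A hedged sketch for the system-level part of that check (RACF-managed access and the install SYSADM/SYSOPR IDs live outside this catalog table, so this is only a start):

-- Who holds SYSADM, SYSCTRL or SYSOPR?
-- ('Y' = held, 'G' = held WITH GRANT OPTION)
SELECT GRANTEE, SYSADMAUTH, SYSCTRLAUTH, SYSOPRAUTH
  FROM SYSIBM.SYSUSERAUTH
 WHERE SYSADMAUTH <> ' '
    OR SYSCTRLAUTH <> ' '
    OR SYSOPRAUTH <> ' ';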

Recovery Status Report

Finally, it lists out a full Recovery Status Report so that you can be sure that, at the time of execution, all of your data was Recoverable.
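As a taster, here is a hedged sketch of one tiny corner of such a report – when was the last full image copy of each tablespace?

-- Latest full image copy (ICTYPE 'F') per tablespace; a real
-- recoverability check must also consider logs, incremental copies
-- and events such as REORG or LOAD LOG NO:
SELECT DBNAME, TSNAME, MAX(TIMESTAMP) AS LAST_FULL_COPY
  FROM SYSIBM.SYSCOPY
 WHERE ICTYPE = 'F'
 GROUP BY DBNAME, TSNAME;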

It is a Lot to Process!

It is indeed. The first time you run it, you might well get tens of thousands of lines of output, but the important thing is to run it and break the results down into little manageable sections that different groups can then work on. This is called "Due Diligence" and can save your firm millions of euros in fines.

Lead Overseer

Great job title, but if this person requests data then you have 30 days to supply everything they request. Not long at all! SAC2 does the lion’s share of the work for you.

Again and Again and Again

Remember, you must re-run this vulnerability check on a regular basis for two major reasons:

  1. Things change – Software, Malware, Attackers, Defenders, Networks, Db2 Releases etc.
  2. Checks get updated – The auditors are always looking for more!

Stay Out of Trouble!

Register, download, install and run today!

I hope it helps you!

TTFN

Roy Boxwell

Future Updates:

The SAC2 licensed version will be upgraded in the first quarter of 2025 to output the results of the run into a Comma-Separated Values (CSV) file, to make management reporting and the delegation of fix-up projects easier. It will also get System Level Backup (SLB) support added. SLB is good, but you *still* need Full Image Copies! Further, it will be enhanced to directly interface with our WLX Audit product.

2024-11 DORA

A title that should strike fear into the hearts of all readers! Nah! I really hope not! This month, I wish to run through the parts of DORA that mostly impact the Db2 for z/OS world…

What is DORA?

DORA is the Digital Operational Resilience Act. It was created on the 14th of December 2022 with an enablement date of 17th January 2025, giving us all over two years to read it, understand it, action its requirements, and accept it fully into our computing culture.

How many of those things have you accomplished in the last two years? You mean you had a day job as well?

Not just Born in the USA!

DORA is valid for the whole wide world, not just businesses within the EU. If you do *any* sort of financial trading within the EU you are under the remit of DORA, just like you are with GDPR!

PCI SSC DSS new update!

As well as DORA, there is a "new" version of the Payment Card Industry Security Standards Council's Data Security Standard: 4.0.1. It comes into force on the 31st of March 2025 and contains a lot of overlap with "our" DORA here in the EU! Here's a link to their website.

What is the Aim of DORA?

The idea is to bring together all the disparate EU regulations into one new regulation for nearly every financial trading house (FinTech), apart from a few exclusions, e.g. microenterprises, thus simplifying the requirements and easing audit and control.

Is it Just a New Level of Audit?

Most definitely not! As the name suggests, Digital Resilience is all about surviving an attack, or a disaster, and being back and processing data (money) as soon as possible. Included within is indeed a bunch of auditable things, but I will get to them later in this newsletter.

What does DORA Cover then?

Security, Operations, Recoverability and Test. These are not everything it covers but they are, at least for me, the biggies. The last of them – Test – is incredibly important in what they mean by "Test". They mean Performance Test, Reliability Test and Vulnerability Test. These are not all new for us, but some are. We all remember GDPR and friends, where types of data had to be "respected", otherwise you got a hefty fine. Now, with DORA, the way you work, run, update and check your systems must all be proven and reported. If you do not deliver you get – guess what? – hefty fines!

Who’s the Boss?

You might have read about this, or seen a presentation, but the absolute "boss" for this regulation is the English PDF version here:

All other versions are translated and so may have errors.

Just the Facts, Ma’am

Well, no, I am not actually going to list all the facts of DORA here, but the highlights for me are the following opening paragraphs:

  • 46 – Mandates vulnerability testing
  • 48 – Maintained systems (Current release/PTF/APAR etc.)
  • 49 & 50 – Recoverability and RTO
  • 56 – Performance, Testing and Scanning

This is a brave new world for lots of us! One of the buzzwords is ICT, which is basically IT.

Book, Chapter and Verse

Chapter 2 Section II Article 6 ICT risk management framework Paragraphs 2, 4 & 6

This is all about risk management and covers unauthorized access, segregation of duties and regular internal audits.

Chapter 2 Section II Article 8 Identification Paragraphs 1, 3 & 7

Hammers home the requirement for regular updates and risk assessments after major changes.

Chapter 2 Section II Article 9 Protection and prevention Paragraphs 1, 2, 3 & 4

This is pure audit: Monitor continuously what is happening, get tooling and policies in place. Make sure that all data is secure at rest, in use and in transit. Validate authenticity and guarantee strong authentication.

This is the most important part for me – it means encrypt at rest, use SSL for *all* things remote, do not use technical user IDs with passwords, use Certificates and/or Trusted Contexts, and implement at least MFA for all users. The "in use" part is a bit crazy, especially for Db2, but I am pretty sure that the Bufferpool is "trusted" and so we, as z/OS users, can ignore this part for now…

Chapter 2 Section II Article 10 Detection Paragraphs 1 & 3

Detect whether weird stuff is happening and monitor user activity!

Chapter 2 Section II Article 11 Response and recovery Paragraphs 1 & 3

Make sure you have the ability to recover quickly, and in a timely manner, with response and recovery plans all ironed out.

Chapter 2 Section II Article 12 Backup policies Paragraphs 1 & 2

Guarantee that your image copies are enough and available to minimize downtime and limit disruption. These must be regularly tested!

Chapter 4 Article 24 Testing Paragraphs 1 & 2

Test! Test and test again – at least yearly!

Chapter 4 Article 25 Testing of ICT tools and systems Paragraphs 1 & 2

Performance Testing, Penetration Testing and Vulnerability Testing.

Chapter 4 Article 26 Advanced testing and TLPT Paragraphs 1, 2 & 6

More advanced testing, including a Threat-Led Penetration Test (TLPT) on live production systems at least every three years! All must be documented, of course…

Chapter 5 Article 35 Powers of the Lead Overseer Paragraphs 1, 6, 7 & 8

Lead Overseer – I get this mixed up with Supreme Leader all the time…

The Lead Overseer is the DORA God in a given country and can simply ask for any of the details I have just listed. Failure to deliver the goods within 30 days (Calendar days, not working days!) will result in fines…

I am Fine!

Well, the fines are pretty nasty… The Lead Overseer can fine any firm up to 1% of the average daily turnover from the previous financial year. This is then compounded by the fact that the Lead Overseer can levy this fine *every* day for up to six months!!! Ouch!!!

An Example Calculation

Taking a large bank as an example, just to show the math: the turnover in 2023 was nearly 60 billion euros. Divided by 365, that gives us 164 million euros per day. Taking 1% of this (worst case) gives 1.64 million euros. Assuming the Lead Overseer is being especially nasty and levies the fine for 182 days, that leads to around 298 million euros in fines.
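If you would like to scare your management with their own numbers, the arithmetic is simple enough to run as a throwaway SQL – the 60 billion here is from the example above, so plug in your own turnover:

-- Worst case: 1% of average daily turnover, levied for 182 days
SELECT (60E9 / 365) * 0.01 * 182 AS MAX_FINE_EUR
  FROM SYSIBM.SYSDUMMY1;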

I, for one, do *not* want to be the first firm in Europe getting this… and it is all in the public domain which is then a massive image loss as well!

What can I do?

Well, first up, make sure all data is encrypted at rest – This should be a no-brainer due to modern disks/SSDs anyway.

Then, make sure that *all* remote access is using the SECPORT and is being encrypted in flight – again, this should be easy, but remember to then set the PORT to the same value as the SECPORT, which then forces SSL for everything. Do not forget to check your TCP/IP ALIASes!
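You can check the result straight away with the DDF display command – a minimal check sketch (the exact message layout varies a little by Db2 release):

-DISPLAY DDF

In the DSNL080I report that comes back, the TCPPORT and SECPORT values are listed (on a DSNL084I line at most levels) – check that SECPORT is non-zero and that TCPPORT matches it.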

Do a full recoverability test to guarantee that you have all the Image Copies, Logs and Archive Logs that you require to actually do a full recovery. If you can also meet your RTOs here, then even better! Here, our RealTime DBAExpert (RTDX) software can really help out, with timely Image Copies and a verification of complete recoverability.

Audit your Db2 Systems!

Do a Vulnerability Test on your Db2 Systems!

Audit?

I have done a lot of blogs and Webinars just about Audit, so I will spare you the details, but you must actually get it done. You will almost certainly require an external auditor who does a final check/validation that your audit is good to go and then you are done. Here, our excellent WLX Audit product can suddenly become your best friend!!!

Feeling Vulnerable Today?

The Center for Internet Security (CIS) has released a document for Db2 13 on z/OS:
CIS IBM Z System Benchmarks

It contains everything you should do for Audit and vulnerability checking and is well worth a read – and then action the reports within!

Docu Docu Docu

All of these things must be performed, documented and repeated on a regular basis, and sometimes even after a simple "change" has occurred.

The world is a bad place and DORA is really here to help us, but the start will, as always, be a hard climb!

TTFN,

Roy Boxwell

2020-04 Four Flavors of Db2 Audit

These days there is a lot of talk about audit, specifically regarding Db2 on z/OS. So, in this newsletter, I wish to run through four different ways that you can “Get Audit Done”.

As well as simply getting it done, I will also run through the four different ways that you can process the gathered data.


Four ways to get a Db2 z/OS Audit done


1- First up

First option is the simplest, cheapest and quickest:

Do nothing.

Whether or not this will help your company is a non-trivial question of course!

Naturally, this is an absolute no-no.


2- Then we have

Next option is relatively simple and cheap, but requires a bit of work: 

Write it all yourself, but based on existing data that some other process already extracts for you (SMF, for example).

If you happen to have the skills for extracting the required audit data from existing data that is being collected anyway, then this might well be the best method if you are really strapped for resources. 


3- Getting there 

Then we have not so simple, still cheap, but a ton of work: 

Write it all yourself and add all the IFCIDs you actually need to audit your system as well as capturing all the SQL. 

This needs a serious amount of skill to get, and keep, up with the agile world of Db2. You will also need to take care of the amount of data that you will be collecting.

However, the auditor will be happy as you have everything they could ask for.


4- Aha! The only true way 

Last option is simple, not so cheap but very quick: 

Third party software that does it all for you.

This is my preferred solution, especially as we just happen to sell one (WorkLoadExpert Audit).

This is actually the only real way to go. You probably don’t have the time to keep all these things up-to-date and running correctly. 

Data Collected – Now what? 

So, you have chosen one of these ways to gather the data. Now you must evaluate what you got. Here again we have four separate ways to go forward: 

First up 

There it is! 

Do nothing. Just point at the datasets, print outs, database objects and say “It is all in there…” 

This is not really a solution and any auditor worth his, or her, salt would quite rightly be extremely upset! 

Then we have 

A whole bunch of pre-written SQLs. 

SPUFI is ok, but much better would be to see these in a GUI where graphical viewing is built in and saving and sharing results is much easier.  

This is not bad, but still a manual “island” process. Just Db2 and nothing else plus it must be triggered by humans. 

Getting there

A whole bunch of pre-written and custom SQLs.

This time, all run in Batch and the results are emailed to the auditor directly. These emails can “just sit there” until the auditor checks the results. Naturally, if anything is found, then the underlying data must still be there for a detailed analysis.

Better, as it is getting automatic but still not really “round”, as it is still Db2 in isolation…

Aha! The only true way

Use of LEEF or SYSLOGGER-style formats to export all audit data.

The data is then in a data-lake where SPLUNK, QRADAR et al can happily slice and dice their way through the data.

This is the best way!

You also get an extra bonus point for *removing* the data from the mainframe. As auditors *love* a single point of control, this is the only real way forward. It also pushes the Db2 data into the world of other data that auditors use and require.
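For the curious: a LEEF record is just a pipe-delimited header followed by the event attributes as tab-separated key=value pairs. A hedged, hand-made example (vendor, product, event ID and attribute values are all invented for illustration; the attributes are separated by tab characters):

LEEF:1.0|SEG|WLX Audit|1.0|DB2.AUTH.FAILURE|devTime=2020-04-01 06:00:00	usrName=BOXWELL	src=10.1.2.3

Once the records look like this, SPLUNK, QRADAR and friends can ingest them without any mainframe knowledge at all.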


Db2 Audit with "GIVE&TAKE":


Software Engineering GmbH and SEGUS Inc are launching a new free Give&Take which, this time, is the Audit support from WorkLoadExpert.

If you would like to take part, then please just fire off an email to db2support@segus.com telling us who you are and which firm you work for and we will get in touch!

Give and Take 

By the way, it is called “Give&Take” because :

  • we Give you the software, for free, to run for a trial period, and
  • we would like to Take away what you think, feel, and find about the software after the trial period. 

More about Give&Take


TTFN, 

Roy Boxwell 

2018-06 – DST Db2 timestamp problems: I really hate Daylight Saving Time

How to avoid timestamp problems while going from winter to summer time in a Db2 for z/OS system?

Is using CHAR, rather than a native TIMESTAMP, the safest procedure?

How do you fix it?

This year, as every year, the moment arrives for most of us when the clocks go forward, and then in autumn back again. I really hate this, as I still have a bunch of clocks that do not automatically do it for me. My PC, phone, laptop, TV etc. all do it for me, but the rest… Anyway, what has this got to do with Db2, I hear you all wonder? Well, it really is quite a horrible little story coming up…

Same procedure as every year

At precisely 02:00 on the 25th of March a SET CLOCK console command was issued to change the UTC Offset to +2 thus leaping from 02:00 to 03:00 in the blink of an eye.

How long?

Now “how long” is the blink of an eye? For Db2 these days – too long!

Day of reckoning

At 2018-03-25-02.00.00.006999 a transaction was logged in the Audit system, in fact *lots* of transactions were in-flight at this time. Normally it is not a problem and, in fact, nothing happened until nearly three months later when someone found that there was possibly some data missing.

Alarm!

Alarm bells are ringing as these inventory checks cannot have missing data. The code is nowadays all JAVA and the developer in charge of the problem found out that the data was indeed missing!

Oh no it isn’t!

The DBA group were then involved, as it could be data corruption, and they looked and found the data – but it was not the same data as the developers had… then the penny dropped!

Clever old JAVA

In fact, the data the developer had was *exactly* one hour later than the data found by the DBA group. I mentioned earlier that the 25th March was the switch to summer time and, perhaps, the JAVA Driver is “helping” us, a bit too much help if you ask me!

Date Check

Here is a bit of SQL for you to recreate the problem and gaze in wonder at how cool/horrible (delete what is not applicable) JAVA really is.

CREATE TABLE BOXWELL.DAY_LIGHT                           
  (COL1 SMALLINT     NOT NULL                            
  ,COL2 TIMESTAMP    NOT NULL)                           
;                                                         
INSERT INTO BOXWELL.DAY_LIGHT                            
VALUES (1 , '2018-03-25-01.59.59.999999');               
INSERT INTO BOXWELL.DAY_LIGHT                            
VALUES (2 , '2018-03-25-02.00.00.006999');               
INSERT INTO BOXWELL.DAY_LIGHT                            
VALUES (3 , '2018-03-25-03.00.00.000099');               
COMMIT ;                                                 

SELECT * FROM BOXWELL.DAY_LIGHT                           
ORDER BY 1                                               
;
---------+---------+---------+---------+---------+--------
  COL1   COL2                                         
---------+---------+---------+---------+---------+--------
     1   2018-03-25-01.59.59.999999                     
     2   2018-03-25-02.00.00.006999                      
     3   2018-03-25-03.00.00.000099                      
DSNE610I NUMBER OF ROWS DISPLAYED IS 3

The output of SPUFI looks great! Timestamps are correct and all is fine.

It is a GUI world

Now do the select using a JAVA driver of your choice, here I am using DataStudio:

[Screenshot: the same SELECT entered in Data Studio]

And then running it gives:
[Screenshot: the Data Studio result – row 2 is shown one hour later]

Spot the difference!

Isn’t that great/terrifying (delete what is not applicable)? The JAVA driver is “looking” at the timestamp data and seeing: “Oh oh! That timestamp is impossible! I know – I will add one hour to correct it!”

This scares me a little…actually quite a lot!

Docu – What Docu?

The only place I could find anything about this was in a chapter about not using 24 as midnight and the problem of using timestamps between October 5th 1582 and October 14th 1582:

https://www.ibm.com/support/knowledgecenter/en/SSEPEK_11.0.0/java/src/tpc/imjcc_r0053436.html

If you read it you can find this one sentence:

If a string representation of a date, time, or timestamp value does not correspond to a real date or time, Java adjusts the value to a real date or time value.

Which, of course, explains everything!

The quick fix…

There is no quick fix!

1 – The customer must either change all SQL to use the CHAR function – Not good!

Or

2 – Check all of their important timestamp columns for data in the range 02.00.00.000000 to 02.59.59.999999 and then update it with plus one hour – Not good! (Hedged sketches of both fixes follow below.)
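Hedged sketches of both workarounds, using the little test table from above (for real tables, adjust the names and test first!):

-- Fix 1: return the timestamp as a character string so that the
--        JAVA driver has nothing to "correct":
SELECT COL1, CHAR(COL2) FROM BOXWELL.DAY_LIGHT;

-- Fix 2: shift any rows in the impossible hour forward by one hour:
UPDATE BOXWELL.DAY_LIGHT
   SET COL2 = COL2 + 1 HOUR
 WHERE COL2 >= '2018-03-25-02.00.00.000000'
   AND COL2 <  '2018-03-25-03.00.00.000000';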

Faster and Faster: the best fix?

This problem will get worse the faster the machines get, and so my idea to solve it next year is to simply issue a

SET LOG SUSPEND

at one second before 02:00 which flushes the log, issues a system checkpoint (non data-sharing), updates the BSDS and basically pauses the system. Then do the SET CLOCK command and then do a

SET LOG RESUME

It all takes about three seconds and so should not cause any timeouts.

 

I really hope that, one day, we simply get rid of daylight saving time…

 

As always, any questions or comments would be most welcome!

TTFN,

Roy Boxwell

2018-05 Audit 2.0 – GDPR Audit guide and checklist for Db2 z/OS

Or: How I Learned to Stop Worrying and Love the Auditor

A guideline in 5 big Steps to Audit for Personal Data Protection

First up, this is not another Audit review and health check sell! We, and I mean especially those of us in the European Union (EU), are about to get the biggest update to “my data”, “my right to know” and “my right to forget (delete)” that has ever happened, when the General Data Protection Regulation (GDPR) kicks off on May 25th, 2018.

Who cares?

Well, we all should really! Very soon hordes of people could well start demanding to know if you have any data about them and if so where it is and who you give it to. Personal Data will be big business!

Age of Consent

Just assuming all is ok is also no longer valid or lawful – any personal data that you keep must be kept with the *active* consent of the data owner… This is going to get awfully messy awfully quickly! Do not forget that the definition of personal data has also changed quite a bit e.g. IP Addresses. For the record here is the actual text in the GDPR regulations:

Art. 4 GDPR Definitions

For the purposes of this Regulation:

1.      ‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;

And

Art. 9 GDPR Processing of special categories of personal data

1.      Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited.

Security by design and Security of processing

Very important are Articles 25 and 32:

Art. 25 GDPR Data protection by design and by default

1.       Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.

2.       The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons.

The last part of paragraph two is important here.

Art. 32 GDPR Security of processing

1.      Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate:

1.      the pseudonymisation and encryption of personal data;

2.      the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services;

3.      the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident;

4.      a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.

2.      In assessing the appropriate level of security account shall be taken in particular of the risks that are presented by processing, in particular from accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data transmitted, stored or otherwise processed.

Here paragraphs 1.2, 1.4 and 2 are the biggies!

A small or large fine for you today?

The fines are also amazingly high. First, for the “minor” problem of being over 72 hours late in reporting when data leaks have occurred (a breach), it is 2% of global turnover or €10,000,000 – whichever is *higher* – and, if you are really naughty, like disregarding basic data laws, moving data abroad or ignoring an individual’s rights, then you get hit for 4% of global turnover or €20,000,000 – again, whichever is *higher*.

First up?

I, for one, do not want to be the first company that makes the headlines… How about you?

Who are you afraid of?

Now I know that most of the talk about GDPR is “right to data” for the general public but we all know, as IT specialists, that the people to be really afraid of are ex-employees who did not leave on happy terms. They know *exactly* where the knife could best be put… They know exactly where, and on which platforms, all of the data really is.

Protect yourself – Due diligence

What can you do? Well the first thing is to actually understand what GDPR means to you and your firm’s data. Then you must be able to prove to auditors that you tried your very best. We will all probably get hacked at some time, but if you tried your best it is enough. Using all of the data you gather in a post-mortem, or for forensics, is also a very good idea as can be seen in paragraphs 2.3 and 2.4 below:

Art. 83 GDPR General conditions for imposing administrative fines

1. Each supervisory authority shall ensure that the imposition of administrative fines pursuant to this Article in respect of infringements of this Regulation referred to in paragraphs 4, 5 and 6 shall in each individual case be effective, proportionate and dissuasive.

 2. Administrative fines shall, depending on the circumstances of each individual case, be imposed in addition to, or instead of, measures referred to in points (a) to (h) and (j) of Article 58(2). When deciding whether to impose an administrative fine and deciding on the amount of the administrative fine in each individual case due regard shall be given to the following:

1. the nature, gravity and duration of the infringement taking into account the nature scope or purpose of the processing concerned as well as the number of data subjects affected and the level of damage suffered by them;

2. the intentional or negligent character of the infringement;

3. any action taken by the controller or processor to mitigate the damage suffered by data subjects;

4. the degree of responsibility of the controller or processor taking into account technical and organisational measures implemented by them pursuant to Articles 25 and 32;

The list goes on after these of course…

Db2 Audit – a new way?

Most sites use SMF and do daily cuts which they then offload to a repository system of some sort, on some sort of hardware. The problem here is that the amount of SMF data you are generating can swamp you, and this method is not nearly quick enough. Waiting a day costs 24 hours of the 72 that are available to you…

Faster, Better, Securer?

Can you do it faster?  Can you do it yourself faster?  Can you do it better? Maybe…

 

If you can handle OPx and a bit of High Level Assembler then “Bob’s your uncle!” – as long as Bob is your Auditor, of course… There are more than enough examples on the web about how to write and do OPx reading. (There are even vendors that are willing to sell you their software 🙂 )


A guideline in 5 big steps to Audit in a new way for Personal Data protection:


1 – To do list – IFI commands, IFCIDS & Audit Class

Create a program that can issue the required IFI commands to start and stop traces, and to issue the READA and READS IFI calls that you wish to have. If you don’t know which Audit Class is which IFCID – here’s my handy list:

  • Class 1 – IFCID 140 – Authorization failures – Possible brute force password attack
  • Class 2 – IFCID 141 – GRANTs and REVOKEs – Possible temporary raising of privilege
  • Class 3 – IFCID 142 – CREATE/ALTER/DROP of table with AUDIT attribute – Removing the AUDIT attribute is a big red flag
  • Class 4 – IFCID 143 – 1st Change of table with AUDIT attribute – Sensitive data updates
  • Class 5 – IFCID 144 – 1st Select of table with AUDIT attribute – Sensitive data usage
  • Class 6 – IFCID 145 – BIND/PREPARE using table with AUDIT attribute – Possible usage of sensitive data
  • Class 7 – IFCIDs 55, 83, 87, 169, 319 – SET CURRENT SQLID, end of identify, end of signon CICS/IMS – Elevation of privilege
  • Class 8 – IFCIDs 23, 24, 25, 219, 220 – Db2 Utilities – UNLOADing data to where?
  • Class 9 – IFCIDs 146, 392 – For customer use – n/a
  • Class 10 – IFCIDs 269, 270, 271* – Establish reuse trusted context, trusted context CREATE/ALTER and Column Mask/Row Permission CREATE/DROP/ALTER – Anything with MASKs or ROW Permissions must be checked
  • Class 11 – IFCID 361 – Successful access – Only makes sense if working with AUDIT POLICY SYSADMIN or DBADMIN

* IFCID 271 is not started automatically with Class 10.

So, what is missing from this picture?

Well, what about the COMMANDs – to actually see what people are issuing on the machine? So you must also add IFCIDs 90 and 91. Even then, you do not see any of the DDL, so you must start IFCID 62 to get that, as you do want to see what people are ALTERing, CREATEing and DROPping, don’t you?
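A hedged sketch of the trace commands that would do all of this (class list from the table above; OPX tells Db2 to allocate the next free OPn buffer; BUFSIZE and the exact syntax should be checked at your Db2 level):

-START TRACE(AUDIT) CLASS(1,2,3,4,5,6,7,8,10,11) DEST(OPX) BUFSIZE(1024)
-START TRACE(MONITOR) CLASS(30) IFCID(62,90,91) DEST(OPX) BUFSIZE(1024)

The second command uses one of the user-definable monitor classes (30–32), which is the usual trick for collecting IFCIDs that are not part of a predefined class.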


2 – All done in design?

Once you have a system that gathers all these you must buffer them in memory and then do one of two things:

1) Write them out to a file for post-processing and enrichment later
2) Directly write them to your SIEM system of choice

I am not a fan of number 2 because if the SIEM system cannot accept the data then your data is lost…

With No. 1 you can keep processing even if the SIEM system cannot ingest, for whatever reason, and as long as you trigger yourself every 5 – 10 minutes, or when the OPx buffer is full, that is “real time” enough for me!


3 – Care for something else for the weekend?


Plus, you can enrich the data with extra stuff that the naked IFCIDs do not have attached to them, e.g. map DBID/PSID to a real database and tablespace etc. Store all of this data in a bunch of Db2 tables, then kick off a Db2-to-LEEF re-formatter program that reads all the audit data that has been collected since the last run and writes it all out.

4 – Code Page – don’t forget the Code Page

Then you must do an iconv for code page conversion, swiftly followed by a gzip to get it down to size, and then copy it down to a USS file ready for your SIEM system to ingest its prey.


5 – Done, Finished and Complete

To prove that this system is always available, you should also schedule a simple

GRANT SELECT ON SYSIBM.SYSDUMMY1 TO PUBLIC;

every morning at 06:00, which you should see in the output every day. Remember – Due Diligence!


 

So now you are done and finished and can let the SIEM system do the completion!

 

As always, any questions or comments would be most welcome!

TTFN,

Roy Boxwell

PS: Please note that the bolding of parts of the GDPR text is only from me!