What a Performance!

06 March 2002 / Will Heard
Issue: 3847

WILL HEARD BA surveys the Inland Revenue's investigations results from 1991 to 2001

ON 9 JANUARY 2002, the Chancellor was asked how many investigations for tax fraud had been carried out since 1997 (see Taxation, 24 January 2002 at page 378). The Treasury Minister, Andrew Smith, said that the Revenue carried out 'inquiries, as opposed to investigations, into tax returns as part of its work in tackling non-compliance'. He said that there was initially no assumption of fraud or negligence, and that while most inquiries were undertaken on the basis of a 'perceived compliance risk', this was not always true. A successful inquiry was one that was 'carried out thoroughly and in accordance with appropriate codes of practice'.

The minister added, 'Detailed information about the number of inquiries undertaken in each year and the additional liabilities established as a result of the Revenue's work in tackling non-compliance is set out in the Inland Revenue annual report for each year'.

This article has three main purposes. The first is to examine the results of Revenue enquiries into non-compliance over the past ten years, using only the statistics published in the Board's annual reports, and to draw conclusions from them. This is easily done with minimal use of long, detailed, turgid tables and graphs, since the second purpose is to point out that the published statistics on the Revenue's performance are totally inadequate. The third purpose will be stated later in the article.

Why examine such information?

The performance statistics are of interest for three reasons. The first is that the way the Revenue tackles non-compliance has undergone a major upheaval in the last ten years through reorganisation and the introduction of self assessment. An interested taxpayer wishing to know whether this has resulted in a more efficient Revenue would be forgiven for failing to draw any conclusions from the statistics in the Board's reports. The success or failure of the different operational standards and techniques for tackling non-compliance heralded by the introduction of self assessment is of legitimate concern. Taxpayers and people with a professional interest in the matter are being poorly served by the mandarins who splash white gloss paint all over the Revenue's official publications.

Secondly, whether or not the introduction of self-assessment inquiry techniques has been successful, we have a right to know how well our civil servants are performing their tasks in our name and how they are doing so. Otherwise, there is no way of knowing how well civil servants at all levels, but particularly at the top, are doing their jobs. They should be more publicly accountable.

Consequently, the third purpose of this article is to request the Revenue to publish more detailed statistics (the writer's wish-list appears further on) and to state whether or not the introduction of new inquiry techniques has been a success, and why. The Revenue should enable proper consideration of this subject by issuing far more sensitive indicators of performance, so that proper comparisons can be made and useful conclusions drawn.

The statistics

The source of the following figures is the Board's report for the given year in each case. All figures are to 31 March.

Yield to cost ratios – local tax districts

The yield to cost ratio has been extracted from the Board's reports on the basis that it is probably the most meaningful figure available for comparing success rates over time. Moreover, successive reports give a definition of the yield to cost ratio. Even this definition appears to change in some years, although the writer believes that this is due to a lack of rigour on the part of the report's author rather than to a real change in definition. The definition in the Board's report for the year ended 31 March 1992 is:

'in calculating these ratios the direct yield is compared with the costs of salaries, accommodation and other direct operating overheads met by the department. No account is taken of the corrective or deterrent effects; although largely unquantifiable, they are almost certainly substantial.'
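On that definition the ratio is simply direct yield divided by the direct costs of the work. A minimal sketch of the calculation, using entirely hypothetical figures for illustration:

```python
def yield_to_cost_ratio(direct_yield: float, salaries: float,
                        accommodation: float, other_overheads: float) -> float:
    """Direct yield divided by the direct costs met by the department.

    Corrective and deterrent effects are excluded, as the Board's
    definition above notes.
    """
    return direct_yield / (salaries + accommodation + other_overheads)

# Hypothetical figures only: £5m of direct yield against £1m of
# salaries, accommodation and other direct operating overheads.
ratio = yield_to_cost_ratio(5_000_000, 700_000, 200_000, 100_000)
print(f"{ratio:.1f}:1")  # prints 5.0:1
```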

The following points should be noted in relation to Table 1:

  • Lack of detail in the Board's reports does not permit a precise definition of an investigation, but it was generally a case (pre-self assessment) which led to a contract settlement involving tax, interest and penalties. The modern equivalent is a full enquiry.
  • Company accounts compliance and income tax accounts compliance involve cases where the penalty was for failure to file returns or to notify chargeability, and where no investigation of the return has taken place. The term 'compliance' in relation to Schedule D and Schedule E is not properly defined in the Board's reports; presumably it refers to cases dealt with without the imposition of penalties.
  • The term 'employer compliance' replaced 'Schedule E compliance' from 1994-95. The combined results for Schedule E compliance and local pay-as-you-earn audit groups are stated in Table 2 from 1994-95 onwards.
  • 'Intelligence work' refers to the detection and rehabilitation of people working in the informal economy, i.e. ghosts and moonlighters.

Table 1: Local tax offices – yield to cost ratios

| Year    | Company investigations | Company accounts compliance | Income tax cases | Income tax accounts compliance | Schedule D compliance | Schedule E compliance |
|---------|------------------------|-----------------------------|------------------|--------------------------------|-----------------------|-----------------------|
| 1991-92 | 10.0:1 | Probably included in col 1 | 5.5:1 | Probably included in col 3 | 5.0:1 | 17.0:1 |
| 1992-93 | 6.0:1  | 39.0:1 | 4.9:1 | 18.0:1 | 5.0:1 | 16.0:1 |
| 1993-94 | 4.0:1  | 40.0:1 | 4.0:1 | 18.0:1 | 5.0:1 | 13.0:1 |
| 1994-95 | 3.3:1  | 27.5:1 | 3.4:1 | 16.1:1 | 4.2:1 | n/a |
| 1995-96 | 2.7:1  | 34.9:1 | 3.0:1 | 12.9:1 | 4.2:1 | n/a |
| 1996-97 | 2.0:1  | 26.9:1 | 2.9:1 | 12.2:1 | 3.6:1 | n/a |
| 1997-98 | 3.6:1  | Included in col 1 | 4.1:1 | Included in col 3 | 3.2:1 | n/a |

From 1997-98 the Board's reports recast the local office headings as follows:

| Year    | Company full enquiries | Company aspect enquiries | Self assessment full enquiries | Self assessment aspect enquiries | Intelligence work |
|---------|------------------------|--------------------------|--------------------------------|----------------------------------|-------------------|
| 1997-98 | 3.6:1 | 10.6:1 | n/a   | n/a   | 3.2:1 |
| 1998-99 | 2.0:1 | 7.6:1  | 0.3:1 | 3.3:1 | 1.2:1 |
| 1999-00 | 1.4:1 | 7.9:1  | 0.6:1 | 2.6:1 | 1.8:1 |
| 2000-01 | 2.2:1 | 9.8:1  | 1.1:1 | 2.6:1 | 0.9:1 |

The following two points should be noted in relation to Table 2:

  • From 1994-95 Enquiry Branch and Special Office were amalgamated to form Special Compliance Office.
  • In the same year, local pay-as-you-earn audit groups were integrated with local tax districts. The combined figures for pay-as-you-earn audit and Schedule E compliance work are now stated under the heading 'Employer compliance'.

Table 2: Yield to cost ratios – head office groups and PAYE audit/employer compliance

| Year    | Enquiry Branch | Special Office | Special Compliance Office | Local PAYE audit groups/employer compliance | National PAYE audit |
|---------|----------------|----------------|---------------------------|---------------------------------------------|---------------------|
| 1991-92 | 23:1 | 33:1 | n/a    | 6.0:1 | 10.0:1 |
| 1992-93 | 23:1 | 35:1 | n/a    | 7.0:1 | 12.0:1 |
| 1993-94 | 21:1 | 24:1 | n/a    | 5.0:1 | 10.0:1 |
| 1994-95 | n/a  | n/a  | 20.0:1 | 7.0:1 | 12.0:1 |
| 1995-96 | n/a  | n/a  | 16.5:1 | 5.9:1 | 12.1:1 |
| 1996-97 | n/a  | n/a  | 20.4:1 | 5.2:1 | 11.7:1 |
| 1997-98 | n/a  | n/a  | 14.4:1 | 4.5:1 | 16.8:1 |
| 1998-99 | n/a  | n/a  | 14.6:1 | 3.0:1 | 7.5:1  |
| 1999-00 | n/a  | n/a  | 15.2:1 | 3.1:1 | 5.6:1  |
| 2000-01 | n/a  | n/a  | 15.2:1 | 2.7:1 | 7.0:1  |

(Enquiry Branch, Special Office and Special Compliance Office are Inland Revenue head office groups; the last two columns cover PAYE audit and employer compliance work.)

What the figures mean

The reader may gain little enlightenment from these figures, but they appear to show that there has been a major decrease in yield to cost over the past ten years on almost every count, and that the first few years of enquiry work under self assessment have been an unmitigated disaster for the Revenue.
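That claim can be checked against Table 1 directly: a yield to cost ratio below 1:1 means that, on the Board's own definition, the work cost more in salaries and overheads than it produced in direct yield. A minimal check, using the self assessment full enquiry figures transcribed from the table:

```python
# Self assessment full enquiry yield to cost ratios, from Table 1 above.
sa_full_enquiries = {"1998-99": 0.3, "1999-00": 0.6, "2000-01": 1.1}

# A ratio below 1:1 means direct costs exceeded direct yield.
for year, ratio in sa_full_enquiries.items():
    verdict = "below" if ratio < 1.0 else "above"
    print(f"{year}: {ratio}:1 ({verdict} break-even)")
```

On those figures, full enquiries under self assessment did not even cover their own direct costs until 2000-01.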

The writer is inclined to think that the Revenue should go back to the drawing board. There should be a proper digest of statistics relating specifically to inquiry work, supplying sufficient information for the Public Accounts Committee, Members of Parliament, academics and other interested parties to draw useful inferences from the data and to see how well the Revenue is responding to changes in trends. This is not possible currently: taxpayers do not know whether the Revenue's tax investigations resources are properly directed, or whether they should be increased or decreased. The Revenue cannot be called to account, nor can it be congratulated on doing a fine job.

It is certain that in recent years the Revenue has been equipped with increasingly draconian powers of investigation which, in the writer's opinion, may well be justified. However, without appropriate information, it is impossible to know for sure.

What information should be published?

The following list is drawn from nearly 28 years of tax investigations experience, split almost equally between employment in the Inland Revenue and employment/self employment in the tax investigations community. The need for brevity precludes a full explanation of why such information is required, but most people with tax investigations experience will sympathise. Statutory references are to the Taxes Management Act 1970 unless otherwise stated. The information listed below would have to be analysed by head office, tax region and local tax district where appropriate (a sketch of what one row of such a digest might look like follows the list).

  • search operations under section 20C carried out annually;
  • prosecutions taken under various laws including section 144, Finance Act 2000 (fraudulent evasion of income tax);
  • orders for information issued under section 20BA;
  • section 20(1) and section 20(3) information notices applied for and granted;
  • section 20(2) information notices granted by the Board;
  • section 20A powers operated annually and related section 99 settlements;
  • section 9A notices issued annually;
  • section 19A notices issued annually and fines imposed;
  • section 29 discovery assessments issued each year;
  • closure notices under section 28A(5) issued each year by Inspectors;
  • applications for closure notices under section 28A(6) made by taxpayers and the results;
  • the number of cases brought before the General Commissioners annually to resolve disputes allied to tax investigations, and the results (for or against the taxpayer);
  • the number of referrals to the General and Special Commissioners annually for information under the Commissioners' Regulations in Statutory Instrument 1994 Nos 1811 and 1812;
  • all tax yield information (as currently published) and ratios to be analysed by region and tax district;
  • the average financial penalty rate for each type of penalty (section 93 to section 98A inclusive);
  • substantially more detail on the performance of national and local employer compliance audit groups.
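As a concrete illustration, the sketch below models one row of such a digest. Every field name is hypothetical and purely for illustration; the statutory headings are those in the wish-list above.

```python
from dataclasses import dataclass

@dataclass
class EnquiryDigestRow:
    """One hypothetical row of the proposed statistics digest.

    A full digest would carry one row per year for each head office,
    tax region and local tax district, so that comparisons could be
    made across units and over time.
    """
    year: str                # e.g. "2000-01"
    unit: str                # head office, tax region or local tax district
    s9a_notices: int         # section 9A enquiry notices issued
    s19a_notices: int        # section 19A information notices issued
    s29_assessments: int     # section 29 discovery assessments issued
    s28a5_closures: int      # closure notices issued under section 28A(5)
    direct_yield_gbp: float  # direct yield, as currently published
    direct_cost_gbp: float   # salaries, accommodation and direct overheads

    @property
    def yield_to_cost(self) -> float:
        return self.direct_yield_gbp / self.direct_cost_gbp
```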

All this information is well known to the Revenue and would not cost a great deal to collate for external consumption. Readers may suggest that this wish-list over-emphasises the point being made, but let us take two examples of how important figures are buried, either deliberately to avoid embarrassment, or because the current statistics are too bland for anyone to notice.

Table 10 of the latest Board's report contrasts the results of 'Tackling non-compliance' for the two years to 31 March 2001. For 2000, the additional liability established for National Insurance contributions was £259 million. There is no comparable figure for 2001; instead, a note states 'Yield now incorporated with mainstream employer compliance activities'.

There is no other relevant statistic in the Board's report that shows a sudden leap of £259 million, or anything like that figure. Indeed, the only set of figures that could plausibly have absorbed the 2001 results for additional National Insurance contributions is that under the heading 'Employer Reviews' in the same table. The comparable figures for 2000 and 2001 are £195.7 million and £194.1 million respectively.

This is either sloppy wording, or we must conclude that the gains from tackling National Insurance non-compliance in 2001 were so poor that they could not be published.
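The arithmetic behind that conclusion is worth setting out explicitly. Had the 2001 National Insurance yield genuinely been folded into 'Employer Reviews', that heading should have leapt by something approaching the £259 million reported for 2000; instead it fell slightly:

```python
# Figures quoted above from Table 10 of the latest Board's report (£ million).
ni_yield_2000 = 259.0          # additional NIC liability reported for 2000
employer_reviews_2000 = 195.7
employer_reviews_2001 = 194.1

change = employer_reviews_2001 - employer_reviews_2000
print(f"Change in 'Employer Reviews': £{change:.1f}m")  # -£1.6m
print(f"Leap needed to absorb a comparable NIC yield: £{ni_yield_2000:.0f}m")
```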

The second example is taken from statistics that the Revenue had to supply on the order of the Public Accounts Committee the last time (to the writer's knowledge) that any serious questions were asked about the Revenue's performance in tackling non-compliance. Details are to be found in the 31st report of the Public Accounts Committee, headed 'Inland Revenue: Employer Compliance Reviews', covering non-compliance in 1995-96. Further information supporting this report is to be found in the report by the Comptroller and Auditor General, HC 51, June 1997.

Among many other interesting statistics (for example, 58 per cent of pay-as-you-earn audits among accountants found non-compliance with an average yield of £2,800 compared with 54 per cent of audits of fish and chip shops with an average yield of £1,600), it was discovered that there was a staggering disparity of results between regions. The following disparities are particularly worthy of note:

  • The average yield per member of staff (probably at national audit group level) varied from £100,000 in the south west region to over £300,000 in the north west region.
  • The proportion of compliance reviews carried out by local employer compliance units in Scotland where irregularities were discovered varied from 40 to 90 per cent.
  • In 1995-96, staff in the east region were twice as likely to impose a penalty as those in Northern Ireland, and staff in the south west region imposed financial penalties twice the size of those imposed by staff in Scotland.
  • Penalties were imposed in only one in ten reviews where unpaid liabilities were found, and the average rate of penalty was 14.6 per cent.

The overall conclusion was that the Revenue should make more effective use of information technology and 'better use of its management information to establish why some compliance teams seem to be more efficient and effective than others'.

Did the Revenue take these observations to heart? Are regional performances now more uniform, and is the average penalty in an employer compliance case more or less than 14.6 per cent? How is anyone to know? All that we know is that, on the basis of yield to cost ratios, the national audit groups now yield 7:1 compared with 12:1 in 1995-96, and the local employer compliance units 2.7:1 compared with 5.9:1 in 1995-96. This, by any reckoning, amounts to a staggering fall in performance. By the way, what happened to our £259 million National Insurance money?
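The size of that fall is easily quantified from the Table 2 figures quoted above:

```python
# Yield to cost ratios from Table 2 above (1995-96 versus 2000-01).
falls = {
    "National PAYE audit groups": (12.1, 7.0),
    "Local employer compliance units": (5.9, 2.7),
}

for unit, (then, now) in falls.items():
    pct = (then - now) / then * 100
    print(f"{unit}: {then}:1 -> {now}:1, a fall of {pct:.0f} per cent")
# Roughly a 42 per cent fall for the national groups and a
# 54 per cent fall for the local units.
```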

 

Will Heard is a specialist in tax investigations work operating from the Midlands. He may be contacted on 01676 532159 or via www.specialtax.co.uk.

 
