clearly. For example, award-winning reports are more likely to use cross-referencing to increase connectivity between sections and ensure a more integrated discussion. We also find that award-winning reports are grammatically less complex and contain more relevancy markers signalling important content. These language and presentational features help to improve 'cognitive processing'.

Our app measures these features and ranks reports according to content and ease of cognitive processing. More than 26,000 documents published between 2003 and 2017 by companies listed on the London Stock Exchange have been analysed and scored on features such as length, readability, sentiment, and overall reporting quality.

Average report readability – measured using an algorithm that penalises long sentences and complex words – is generally poor, and there has been no noticeable improvement over the sample period. Long, unstructured documents containing complex language mean many retail investors and other non-specialist stakeholders struggle to understand the typical annual report.

Sentiment also varies dramatically across different sections within the same report. For example, sections where regulation and compliance shape content – such as governance statements and remuneration reports – are characterised by neutral language. In contrast, the tone of language is up to four times more positive in sections where management have more reporting discretion and where performance is the primary focus.

Already, the practical applications of CFIE-FRSE are apparent. We are working with the Financial Conduct Authority to develop methods to improve the monitoring of companies by looking at what they disclose, and whether that can help predict future outcomes, particularly negative ones, such as accounting scandals. The Financial Reporting Council (FRC) has worked with us on various aspects of financial reporting. We have looked at strategy reporting and whether the information is useful.
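The article does not give the exact formulas behind these scores, but the two ideas it describes – penalising long sentences and complex words, and measuring tone from word choice – can be sketched in a few lines. The following is an illustrative approximation only: the readability function is a Gunning Fog-style measure (a common choice in disclosure research), and the tiny positive/negative word lists are placeholders, not the dictionaries the CFIE-FRSE app actually uses.

```python
import re

def count_syllables(word: str) -> int:
    """Rough syllable count: contiguous vowel groups, minimum one."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fog_index(text: str) -> float:
    """Gunning Fog-style score: higher = harder to read.
    Penalises long sentences and words of three or more syllables."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    avg_sentence_length = len(words) / len(sentences)
    pct_complex = 100 * sum(count_syllables(w) >= 3 for w in words) / len(words)
    return 0.4 * (avg_sentence_length + pct_complex)

# Illustrative word lists only -- real financial-sentiment work uses
# purpose-built dictionaries with thousands of entries.
POSITIVE = {"strong", "growth", "improved", "record", "achieved"}
NEGATIVE = {"decline", "loss", "adverse", "weak", "impairment"}

def net_tone(text: str) -> float:
    """Net tone: (positive - negative word count) / total words."""
    words = [w.lower() for w in re.findall(r"[A-Za-z']+", text)]
    if not words:
        return 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)
```

Comparing `net_tone` across sections of the same report would surface exactly the pattern described above: near-zero scores for compliance-driven sections and markedly positive scores where management controls the narrative.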
Just because management reports some information on strategy does not mean it is going to be useful: the devil is in the detail. The FRC is especially interested in understanding whether, when discussing strategy, management provide bland, generic platitudes that fail to disclose any useful information, or whether they are actually saying something meaningful.

We have also compiled a report, Hidden Talent, with the Pensions and Lifetime Savings Association on workforce reporting, looking at how much disclosure firms provide around aspects of their workforce, including training, health and safety, and wellbeing. The CIPD, which represents HR professionals, has expressed an interest in developing a framework for how companies should report on their workforce. The analyses we are able to do can test what firms currently say, and hence what scope exists for further improvement. Finally, the IRS uses aspects of our data and analyses to aid its Best Practice Awards judges.

The applications are manifold, and as we refine our work further, we hope to make complicated and extensive annual report disclosures easier to access and analyse.

Steve Young is a Professor in the Department of Accounting and Finance. The original paper, 'Retrieving, classifying and analysing narrative commentary in unstructured (glossy) annual reports published as PDF files', was co-authored by Mahmoud El-Haj, Paulo Alves, Paul Rayson and Martin Walker. s.young@lancaster.ac.uk

FIFTY FOUR DEGREES | 15

"One of the questions we are addressing using our app and data is: what makes a 'good' annual report?"