
Procedia Technology 5 (2012) 190–198

2212-0173 © 2012 Published by Elsevier Ltd. Selection and/or peer review under responsibility of CENTERIS/SCIKA - Association for Promotion and Dissemination of Scientific Knowledge

doi: 10.1016/j.protcy.2012.09.021

CENTERIS 2012 - Conference on ENTERprise Information Systems / HCIST 2012 - International Conference on Health and Social Care Information Systems and Technologies

A solution for real time monitoring and auditing of organizational transactions

Rui Pedro Marques (a,*), Henrique Santos (a), Carlos Santos (b)

(a) Algoritmi, Universidade do Minho, Campus de Azurém 4800 Guimarães, Portugal
(b) Govcopp, Universidade de Aveiro, Campus de Santiago 3810 Aveiro, Portugal

Abstract

The controlling and auditing of organizational transactions in real time makes it possible to determine the degree of reliability with which they are carried out, mitigating organizational risk. This paper presents a solution proposal under a new vision for organizational auditing and monitoring in real time, since it is focused on the implementation of continuous assurance services in organizational transactions in compliance with the formalisms of a business ontological model. Furthermore, this paper contributes to a new paradigm of transactional auditing, which is intended to operate at a very low and detailed level of organizational transactions.

© 2012 Published by Elsevier Ltd. Selection and/or peer-review under responsibility of CENTERIS/HCIST.

Keywords: organizational transactions; management information system; continuous assurance; monitoring; auditing; real time; risk profiles

* Corresponding author. Tel.: +351 253 510180; fax: +351 253 510189.
E-mail address: ruimarques@ua.pt

Available online at www.sciencedirect.com


1. Introduction

Currently organizations face several challenges: their organizational transactions have grown in volume and complexity, and they operate in highly regulated business environments. Thus, controlling and monitoring mechanisms are needed in order to evaluate and validate all transactions, in a comprehensive manner, to meet the applicable controls and regulations. However, the traditional audit process occurs mostly after the completion of transactions, since it is not feasible to audit them manually as they happen; this makes it impossible to inhibit the risk associated with their execution. Therefore, for many organizations there is a significant risk that errors and fraud are not detected in time, resulting in a negative impact on organizations. See, for example, the current global financial crisis and the successive well-known scandals in organizations such as Lehman Brothers, A-Tec, Madoff, Kaupthing Bank, WorldCom, Enron, Parmalat and Tyco, among many others [1-4].

Thus, any organization must be sufficiently prepared to survive, regardless of its exposure and of the large number of risks it is subject to, by implementing a suitable system of Continuous Assurance in accordance with the applicable legislative and regulatory framework. Continuous Assurance has been assuming an important role within the organizational context because it is the application of emerging information technologies to the standard techniques of auditing. "Continuous" does not mean real time; rather, it means being effective while remaining consistent with the pulse and rhythm of each organizational transaction and process [5, 6].

These aspects have propelled a new awareness of corporate governance and of the growing importance of monitoring and controlling the various organizational transactions (that is, any activity performed within a business process). In line with this, a study by PricewaterhouseCoopers [7] examined various organizations and concluded that about 89% of participating organizations intended to adopt more continuous auditing and monitoring solutions by 2012.

1.1. Motivation

Given the foregoing, it is necessary to find solutions which allow organizations to evaluate, monitor and validate their transactions continuously and independently, preferably in a non-intrusive way. Optimization of operational performance also becomes possible if this auditing is done in real time (in the shortest time possible after a transaction's execution), reducing in this way the associated risks.

Alongside this, there is another aspect to consider in relation to organizational transactions: risk profiles. In this context, risk profiles refer to the classification of the different types of behavior that may occur in the execution of a transaction. In this work, two terms are used to characterize risk profiles: negative profiles, which refer to all unwanted behaviors during the execution of transactions (for example incomplete or poorly executed operations, lack of crucial procedures, non-conformities, delays, incongruities and malfeasance); and positive profiles, which refer to all valid and appropriate events [8, 9].

Thus, this paper focuses on the implementation of real time assurance services over organizational transactions, structured according to an ontological model of organizational transactions. An ontological model is important because it helps to understand the essence of organizational transactions and processes and their relationships and characteristics. By contrast, a simpler business view, detached from any ontological representation, results in an inability to generate organizational knowledge [10]. Therefore, this work intends its prototype to be a system with a broader and more detailed vision of organizational processes and transactions, respecting the formalities of an ontological model capable of representing the organizational reality. The work presented in this paper is supported by "Enterprise Ontology", the model proposed by Dietz [11]. This model is suited to representing the essential structure of organizational transactions without significant complexity, while exhibiting coherence (the parts constitute an integral whole), consistency (there are no contradictions or irregularities), comprehensiveness (all the important issues are handled) and conciseness (the model does not contain superfluous matter). Furthermore, it has been applied successfully in several practical projects in recent years [12].

This paper is structured in six sections, including this introduction to the topic and the motivation. The next section provides a brief literature review. Then, the solution proposal is presented, one that shows evidence of being effective in achieving the stated objectives. Section 4 presents the main results which are intended to be achieved and overviews a methodology for the implementation and evaluation of the proposed solution. Finally, the last section presents the authors' conclusions.

2. Literature review

This section presents some paradigms and concepts associated with monitoring and auditing at the level of execution of organizational transactions and processes. Some research projects and applications related to the topic are also presented. The concepts evinced here have features and specificities similar to those of the prototype conceptualized in this paper.

The first two concepts to mention are Business Activity Monitoring (BAM) and Business Process Management (BPM). BAM allows events generated by the various applications and systems of an organization, or by services of inter-organizational cooperation, to be processed in real time in order to identify critical situations in the performance indicators. It aims to obtain a better insight into the business activities, and thus improve their effectiveness. BAM identifies and analyzes in real time the cause-effect relationships between events, enabling the system and/or staff to take effective and proactive measures in response to specific identified scenarios. It allows, for example, the early detection of abnormal events in business processes as a whole, or in some of their constituent parts [13, 14]. In turn, BPM is defined as "supporting business processes using methods, techniques and software to design, enact, control and analyze operational processes involving humans, organizations, applications, documents and other sources of information" [15]. BPM systems are complex assemblies of software components and tools that together provide features which allow us to develop, deploy and implement solutions based on business processes. Moreover, they enable the visualization, monitoring and management of events within the business process, and allow a high-level visualization of the state of execution of business processes, reducing the causes of the occurrence of exceptions [14].

Another interesting concept to present here is Complex Event Processing (CEP), which includes methods, techniques and tools to process events in real time. CEP analyzes a series of data in real time, identifies patterns and generates events that can be processed and treated. This processing is done in memory and its logic is defined by a series of queries over all received data [16, 17]. In short, CEP is capable of processing high volumes of data from different sources; it operates on data streams, has low latency, has a limited processing window, and can handle different types of operations on data, such as filtering, correlation, aggregation and pattern association.
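The two defining CEP traits named above (in-memory processing and a limited processing window) can be illustrated with a minimal sketch. This is not the API of any CEP engine cited in this paper; the `Event`, `SlidingWindow`, and the Dietz-style event kinds are illustrative assumptions.

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class Event:
    transaction_id: str
    kind: str        # illustrative stage names, e.g. "request", "execute", "accept"
    timestamp: float


class SlidingWindow:
    """Keeps only the events inside a fixed time window and answers
    simple queries (filter + aggregate) over them, entirely in memory."""

    def __init__(self, window_seconds: float):
        self.window = window_seconds
        self.events: deque = deque()

    def push(self, event: Event) -> None:
        self.events.append(event)
        # Expire events that fell out of the processing window.
        while self.events and event.timestamp - self.events[0].timestamp > self.window:
            self.events.popleft()

    def count(self, kind: str) -> int:
        # A trivial "query" over the windowed data: count events of one kind.
        return sum(1 for e in self.events if e.kind == kind)


# Usage: a 60-second window over a small event stream.
w = SlidingWindow(window_seconds=60.0)
t0 = 1000.0
w.push(Event("T1", "execute", t0))
w.push(Event("T1", "execute", t0 + 10))
w.push(Event("T1", "accept", t0 + 20))
assert w.count("execute") == 2
w.push(Event("T1", "execute", t0 + 90))   # the three earlier events expire
assert w.count("execute") == 1
```

A real engine would add declarative query languages, correlation across streams and persistence-free operation at scale, but the windowed, in-memory structure is the same.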

2.1. Related work

Researchers from Brandeis University, Brown University, and MIT carried out the Aurora project [18]. This project was designed to handle and manage very large data streams and allows its users to create their own queries from a set of available operators. These operators are connected to other operators or may simply provide results, and they may take as input the output data of other operators or external data sources. Aurora is capable of optimizing a query considering the QoS indexes provided by the operators and other indexes and system inputs specified by the users. It was a precursor of other similar data-flow monitoring systems, e.g. Medusa [19] and Borealis [20]. Like Aurora, the STREAM project [21] is a Data Stream Management System. STREAM supports a large number of declarative continuous queries over continuous data streams and/or over traditional data repositories. Monitoring is done by controlling the results of the queries made.

The EasyCredit work [14, 22] is an example of a successful implementation in the banking sector. It is a BAM-like system, using the concept of CEP and a processing pipeline, for the management and monitoring of credit transactions in real time.

Some works on this topic, in which event monitoring is done using log records, were also found. Within this group of works, one uses mining tools and CEP to analyze the records of a database log in real time, and presents the sequence and the model of the transactions analyzed [16]. Furthermore, using mining techniques on log records, the possibility of recognizing events was demonstrated: events that are not associated a priori with any workflow or process model can be linked to a new model; in other words, this contributes to the discovery of new process and transactional models [23].
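The first step shared by most such process-discovery techniques is extracting a directly-follows relation from the raw log. The sketch below illustrates only that first step, on a made-up log; it is not the algorithm of the works cited above.

```python
from collections import defaultdict

# A toy event log: each trace is the ordered list of events
# observed for one transaction instance (names are illustrative).
log = [
    ["request", "promise", "execute", "accept"],
    ["request", "promise", "execute", "reject"],
]

# Directly-follows relation: which event kinds are observed
# immediately after each event kind, across all traces.
follows = defaultdict(set)
for trace in log:
    for a, b in zip(trace, trace[1:]):
        follows[a].add(b)

assert follows["request"] == {"promise"}
assert follows["execute"] == {"accept", "reject"}
```

From this relation a discovery algorithm would then infer splits, joins and loops to propose a process model for events that had no model a priori.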

Another work of reference [24] describes an approach for automating the discovery of patterns of activity in organizational process models through an ontology. This discovery of patterns is done through a mapping between the elements collected in the processing of the process and the elements of the ontology, primarily to verify whether the process contains the necessary elements to meet the definition of each process pattern.

3. Solution proposal

This section presents a proposal for a solution which shows evidence of being effective in achieving the stated objectives. The requirements that must be implemented are presented along with a possible conceptual architecture to meet these requirements. However, this section does not aim to describe the solution technologically, but to clarify, from a conceptual point of view, the purpose of each component and their relationships, following the alignment of the requirements which will be described.

A solution that addresses the problems and motivations presented in this paper should meet several requirements. The first is the need to conceptualize a layer of non-intrusive internal control mechanisms to be incorporated in the operational information system (e.g. the ERP), which supports the execution of the organizational transactions to be monitored and audited. These internal control mechanisms, when embedded into the ERP system, must be aligned with the ontological model of Dietz; in other words, their design will have to take into account the different types of events, stages and relationships that constitute the essence of each transaction. Furthermore, a component that manages and stores the results which derive from the internal control mechanisms is also needed, as well as a tool for the extraction of the results of the internal control mechanisms and their transformation and storage in the previous component. This extraction of the data provided by the internal control mechanisms should be made as soon as possible after the occurrence of the event monitored by the respective control.

A key requirement of this proposal is the development of a risk profiles repository. It should be able to maintain and manage the known negative and positive profiles of each organizational transaction to be monitored and audited. These profiles must also be modeled according to the ontological model. Another key requirement is the development of a module which compares the data from the internal control mechanisms with the records maintained in the risk profiles repository. In addition, it must be able to determine which profile is being followed by each running transaction. Moreover, if the situation at hand is unknown and not classified in the risk profiles repository, this module should be capable of introducing this new behavior into the repository and then requesting its classification by the person responsible for the system, here designated as the transactional auditor.

Finally, the on-line results of the comparison module presented above should be stored in a repository in real time. This repository will contain the history of the results of transaction auditing and monitoring, as well as a picture of the current situation regarding the organizational transactions still in progress. This component will provide an interface for the transactional auditor, supporting queries, the preparation of audit reports, notifications and alerts.

The architecture represented schematically in Fig. 1 was conceptualized based on the sought objectives and on the general requirements specified. From the analysis of this architecture, it can be seen that the proposed solution is intended to be permanently connected to the organization's operational information system. In other words, the internal control mechanisms should be incorporated in the ERP in order to monitor the status of the various phases and stages defined by the ontological model of organizational transactions. Thereby, they are able to support the proposed solution, and consequently the monitoring and auditing of organizational transactions.

The component represented by number 8 is the element responsible for extracting the state of the internal control mechanisms and the data they may provide, in order to integrate all this information in a repository (component 1), which will manage and maintain this information referring to the various states of execution of transactions. Component 2 illustrates the risk profiles repository of the organizational transactions to be monitored and audited.

The component represented by number 3 is the module which compares the various records in the risk profiles repository (via data flow 6) and determines which profile is being followed by each running transaction, according to the information received from component 1 (through data flow 5). Furthermore, when facing a situation which is not classified in the risk profiles repository, this module should be capable of introducing this new behavior into the repository (via data flow 7), to be subsequently classified as a positive or negative profile. The results produced by the comparison module (component 3) are sent to a repository (component 4), on which an interface for viewing the current and historic state of control of the audited transactions should be developed. The auditing repository should notify and alert the transactional auditor when a negative behavior occurs, e.g. by sending an e-mail or an SMS.
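The core of the comparison module can be sketched as a lookup of the observed event sequence against the stored profiles, with unknown sequences registered for later classification by the transactional auditor. This is a minimal sketch, not the proposed implementation: the stage names follow Dietz's basic transaction pattern, but the flat-dictionary matching is an illustrative assumption.

```python
# Known risk profiles (data flow 6): event sequences labeled as
# positive or negative. Sequences and labels here are illustrative.
known_profiles = {
    ("request", "promise", "execute", "state", "accept"): "positive",
    ("request", "promise", "execute", "state", "reject"): "negative",
}


def classify(observed: tuple) -> str:
    """Match a running transaction's observed event sequence (data flow 5)
    against the repository; unknown behavior is stored as 'unclassified'
    (data flow 7) until the transactional auditor labels it."""
    if observed in known_profiles:
        return known_profiles[observed]
    known_profiles[observed] = "unclassified"
    return "unclassified"


assert classify(("request", "promise", "execute", "state", "accept")) == "positive"
assert classify(("request", "execute")) == "unclassified"   # new behavior
assert ("request", "execute") in known_profiles             # awaiting the auditor
```

A real comparison module would match partial (still-running) sequences and tolerate interleaved events, but the repository-lookup-plus-escalation logic is the one the architecture describes.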


Fig. 1. Conceptual architecture of the proposed solution

3.1. Technical considerations

Considering the facts presented in the literature review, CEP would tend to be the choice for developing the component responsible for the real time detection and identification of the risk profile being followed in the execution of the various organizational transactions to be monitored and audited. Such a choice would be due to the features and performance of this paradigm in processing large amounts of events and its ability to respond in real time. However, there is a requirement that the prototype must work with the organization's operational information systems in a non-intrusive way, which calls into question, in part, the use of CEP in the development of this component. CEP is primarily designed to work with transient data, and its processing is done in memory. In turn, because the system must be non-intrusive, the functional architecture is designed so that it acts upon the data resulting from the execution of transactions in the operational systems; in other words, the data to be processed will be persistent.

Databases are an option to process persistent data, with the advantage that they do not require specified time intervals, contrary to CEP processing. Because the system acts directly on the database of the operational systems, the organizational events associated with the persistent data have already occurred. Thus, the use of triggers in the operational databases is a way of detecting events, since the insertion, update or deletion of a record or a record field signals the occurrence of an event of a given organizational transaction. The component will then treat the activation of these triggers as the occurrence of an event.
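The trigger-based detection can be sketched with standard SQL. The sketch below uses SQLite for self-containment; the table and column names are illustrative assumptions, not the prototype's schema. A status update in the operational table fires a trigger that records the event in an audit table, so the monitoring component reads events without touching the transaction logic itself.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
-- Illustrative operational table and a non-intrusive audit table.
CREATE TABLE orders(id INTEGER PRIMARY KEY, status TEXT);
CREATE TABLE audit_events(order_id INTEGER, old_status TEXT,
                          new_status TEXT, at TEXT DEFAULT CURRENT_TIMESTAMP);

-- Any status change in the operational table becomes a detected event.
CREATE TRIGGER orders_status_change AFTER UPDATE OF status ON orders
BEGIN
    INSERT INTO audit_events(order_id, old_status, new_status)
    VALUES (OLD.id, OLD.status, NEW.status);
END;
""")

# The operational system works normally; the trigger does the detection.
con.execute("INSERT INTO orders(status) VALUES ('requested')")
con.execute("UPDATE orders SET status = 'promised' WHERE id = 1")

events = con.execute(
    "SELECT order_id, old_status, new_status FROM audit_events").fetchall()
assert events == [(1, 'requested', 'promised')]
```

The extraction component (component 8) would then poll or subscribe to `audit_events` rather than to the operational tables, keeping the coupling one-directional.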

"Real time" is advertised as one of the requirements of the system and is defined within this work as the time interval closest to the occurrence of an event, respecting the rhythm of execution of organizational transactions. Thus, a database approach seems sufficient for the functions of monitoring and auditing, despite not having as low a latency as CEP, because the purpose of the system is not to act in a direct and intrusive way on the execution of organizational transactions, but rather to work with reports and alerts. Consider that the time and rhythm of organizational transactions are variable and of different orders of magnitude (the same transaction may have running times in the order of minutes, days or several months, depending on the situation in question). Therefore, the time in which the user of the prototype has to react in a corrective way, after being alerted to an anomaly, is also variable, depending on the pace of the transaction in question. "A real-time system is one, where the correctness not only depends on the functionality but also on the timeliness of this functionality" [25].

4. Methodology and results

The deployment of the solution proposed above is intended to achieve several results. The first is to ascertain whether it is feasible to build a repository of risk profiles, following the structure and pattern of the transaction axiom of Dietz's ontology, managing multiple positive and negative profiles of organizational transactions. Another intended result is to demonstrate that the repository of risk profiles is a crucial element in the real time monitoring and controlling of organizational transactions, checking whether this repository has the information needed for an analysis and evaluation of how transactions are being carried out.

The development and implementation of internal control mechanisms capable of providing information about the transaction state in its various phases, as defined in Dietz's ontology, is another result to be reached. Finally, we intend to attest that the proposed solution, implemented in accordance with the vision presented in the research problem, is able to respond in real time about the state of execution of organizational transactions, thus constituting a system with Continuous Assurance.

Based on the research problem presented, the solution projected and the results envisioned, triangulation is the proposed research approach: a qualitative approach combined with aspects of a quantitative approach [26, 27]. The choice of a qualitative approach is due to the fact that the research in question is targeted, in a general way, at aspects of management and organizations [28]. However, the quantitative approach is justified because there will be a deployment of a prototype able to provide the collection and analysis of data that may lead to findings from the technological point of view [26].

For the interpretation of the research, the positivist and interpretive epistemologies are the ones to be used. The positivist epistemology will be used to objectively observe and analyze the results of the scientific investigation from the technological point of view [29]. Simultaneously, the interpretive epistemology will be used to validate the resolution of the problem and to understand the value of this result in an organizational environment. However, the final interpretation is a partial analysis because it will be based on a limited set of organizational transactions, the subject of study [26, 30].

To conduct the research, the case study is the methodology which best suits the problem presented, because the research is largely empirical, investigating the feasibility of a system prototype in a real (simulated) context and the resolution of organizational problems [26, 31]. Finally, observation seems to be the appropriate research technique to validate the raised research hypotheses. This technique is based on the observation of a set of phenomena in order to collect data, on a systematic basis, about the behavior of the prototype. The combination of indirect and direct observation seems likely to yield interesting results, by confronting the users' opinions with the researchers' opinions [32, 33].

To concretize the case study we intend to deploy the prototype and evaluate it in an organizational environment; to this end, we aim to use the curricular unit "Enterprise Simulation" from the Higher Institute of Accounting and Administration of the University of Aveiro. This yields a controlled environment, and also allows the application of the prototype in different organizational areas. "Enterprise Simulation", included in the last year of the degree in Accounting, aims to simulate organizational activities. These activities incite dozens of groups of students to create their own enterprise (in one of various organizational areas, such as services, commerce, industry and public services), develop its business operations during an operational period in accordance with the economic calendar, and prepare and disseminate financial statements [34]. The university provides a well-structured simulation environment that is very close to reality, together with an information systems infrastructure that covers all the needs of the organizations in their business activities. The internal control mechanisms described in the architecture of the solution proposal will be incorporated in this information systems infrastructure which the university provides.

5. Final considerations

A solution with assurance services capable of continuous monitoring of organizational transactions in compliance with the formalisms of a business ontological model is an innovative vision, because transactions are monitored and audited at a very low level, contrary to most transaction monitoring, which occurs at a high level (for example, comparing whether a completed transaction followed a set of established procedures). Another innovative vision presented in this paper is the implementation of a repository that contains and maintains the risk profiles of the transactions to be monitored and audited, following the presented ontology.

This paper contributes to this new vision with a proposal of a conceptual architecture of a management information system which aims at the continuous monitoring of organizational transactions executed and supported exclusively in digital format, underpinned by a business ontological model.

References

[1]. Markham, J. W., A financial history of modern United States corporate scandals, Vol. 1. New York: M.E. Sharpe; 2006.
[2]. Verver, J., Risk Management and Continuous Monitoring. Retrieved March 1, 2011. Available from: http://www.acl.com/pdfs/0303-Auditnet.pdf.
[3]. Bodoni, S., Kaupthing Creditors, Madoff, A-Tec, Lehman Brothers: Bankruptcy. Retrieved June 30, 2011. Available from: http://www.bloomberg.com/news/2010-11-25/kaupthing-creditors-madoff-a-tec-lehman-brothers-bankruptcy.html.
[4]. Cohen, J., et al., Corporate Fraud and Managers' Behavior: Evidence from the Press. In: Journal of Business Ethics, R. Cressy, D. Cumming, C. Mallin, Editors. Springer Netherlands; 2011. p. 271-315.
[5]. Morais, M. G. C. T., A importância da auditoria interna para a gestão: caso das empresas portuguesas. In: 18º Congresso Brasileiro de Contabilidade; 2008: Gramado, Brazil. p. 1-15.
[6]. Vasarhelyi, M. A., M. Alles, K. T. Williams, Continuous assurance for the now economy. 1st ed. Sydney: Institute of Chartered Accountants in Australia; 2010.
[7]. PricewaterhouseCoopers, Internal Audit 2012. New York: PricewaterhouseCoopers LLP; 2007.
[8]. Santos, C., Modelo Conceptual para Auditoria Organizacional Contínua com Análise em Tempo Real. 1st ed. Editorial Novembro; 2009.
[9]. Denning, D. E., An Intrusion-Detection Model. IEEE Transactions on Software Engineering, 1987. 13: p. 222-232.
[10]. Hepp, M., D. Roman, An Ontology Framework for Semantic Business Process Management. Proceedings of Wirtschaftsinformatik, 2007.
[11]. Dietz, J. L. G., Enterprise Ontology: Theory and Methodology. New York: Springer-Verlag Inc.; 2006.
[12]. Albani, A., J. L. G. Dietz, Enterprise ontology based development of information systems. International Journal of Internet and Enterprise Management, 2011. 7(1): p. 41-63.
[13]. McCoy, D. W., Business activity monitoring: Calm before the storm. Technical Report LE-15-9727. Gartner; 2002.
[14]. Brandl, H.-M., D. Guschakowski, Complex Event Processing in the context of Business Activity Monitoring - An evaluation of different approaches and tools taking the example of the Next Generation easyCredit. Diploma Thesis. Faculty of Information Technology/Mathematics, University of Applied Sciences Regensburg: Regensburg, 2007.
[15]. ter Hofstede, A., W. van der Aalst, M. Weske, Business Process Management: A Survey. In: Business Process Management, M. Weske, Editor. LNCS 2678; Springer Berlin/Heidelberg; 2003.
[16]. Oliveira, J. J. R. d., Descoberta de Processos em Tempo Real. Master Dissertation. Instituto Superior Técnico, Universidade Técnica de Lisboa: Lisboa, 2011.
[17]. Roth, H., et al., Event data warehousing for Complex Event Processing. In: Research Challenges in Information Science (RCIS), Fourth International Conference on; 2010.
[18]. Abadi, D. J., et al., Aurora: a new model and architecture for data stream management. The VLDB Journal, 2003. 12(2): p. 120-139.
[19]. Balazinska, M., H. Balakrishnan, M. Stonebraker, Load management and high availability in the Medusa distributed stream processing system. In: Proceedings of the 2004 ACM SIGMOD International Conference on Management of Data; 2004, ACM: Paris, France. p. 929-930.
[20]. Abadi, D., et al., The Design of the Borealis Stream Processing Engine. In: 2nd Biennial Conference on Innovative Data Systems Research (CIDR'05); 2005.
[21]. Arasu, A., et al., STREAM: The Stanford Data Stream Management System. IEEE Data Engineering Bulletin, 2004. 26(1): p. 19-26.
[22]. Greiner, T., et al., Business Activity Monitoring of norisbank Taking the Example of the Application easyCredit and the Future Adoption of Complex Event Processing (CEP). In: Proceedings of the IEEE Services Computing Workshops; 2006, IEEE Computer Society. p. 83.
[23]. Ferreira, D. R., D. Gillblad, Discovering Process Models from Unlabelled Event Logs. In: Proceedings of the 7th International Conference on Business Process Management; 2009, Springer-Verlag: Ulm, Germany. p. 143-158.
[24]. Ferreira, D. R., S. Alves, L. H. Thom, Ontology-Based Discovery of Workflow Activity Patterns. In: 2nd International Workshop on Reuse in Business Process Management; 2011: Clermont-Ferrand, France.
[25]. Möller, M. O., Structure and Hierarchy in Real-Time Systems. PhD Thesis. Department of Computer Science, University of Aarhus: Aarhus, Denmark, 2002.
[26]. Grilo, R. M. M., Investigação em Sistemas de Informação Organizacionais em Portugal. Master Dissertation. Departamento de Engenharias, Universidade de Trás-os-Montes e Alto Douro: Vila Real, 2008.
[27]. Patton, M. Q., Qualitative Evaluation and Research Methods. 3rd ed. Newbury Park, CA: Sage Publications, Inc.; 2002.
[28]. Myers, M. D., Qualitative Research in Information Systems. Retrieved March 1, 2011. Available from: http://www.qual.auckland.ac.nz/.
[29]. Babbie, E., The practice of social research. Belmont, CA: Wadsworth Publishing Company; 1993.
[30]. Giorgi, A., The theory, practice and evaluation of the phenomenological method as a qualitative research procedure. Journal of Phenomenological Psychology, 1997. 28: p. 235-260.
[31]. Yin, R. K., Case Study Research: Design and Methods. Vol. 5. Newbury Park: Sage Publications; 2009.
[32]. Coolican, H., Research Methods and Statistics in Psychology. Hodder and Stoughton; 2004.
[33]. Quivy, R., L. V. Campenhoudt, Manual de Investigação em Ciências Sociais. Lisbon: Gradiva; 1998.
[34]. Silva, F., As Tecnologias da Informação e Comunicação e o Ensino da Contabilidade. Master Dissertation. Departamento de Economia, Gestão e Engenharia Industrial, University of Aveiro: Aveiro, 2009.

... Traditionally, the auditing of transactions is generally performed at the end of the transaction, that is, once all transactions have been completed, since auditing is not commonly carried out while a transaction is still in progress [2]. However, this is not sufficient: checking the success of a transaction only by its final result cannot fully answer whether the transaction was carried out properly or not. ...

... The research conducted by Rui Pedro Marques, Henrique Santos and Carlos Santos [2] states that controlling and auditing transactions in real time can yield knowledge about how reliable the executed transactions are. This illustrates the importance of the monitoring process in validating and controlling a transaction. ...

  • Conan Aditya Wijaya

As times change and needs grow, business processes are becoming increasingly complex, and education is no exception. The amount of information and resources held by higher education institutions, particularly at the study programme level, makes their management complicated. Monitoring and evaluation are therefore needed to guarantee the validity of the transactions involved. In this research, the authors built a system that monitors and evaluates the management of a study programme. The system is built on the assessment points contained in the accreditation forms, which the study programme uses as a benchmark of the quality of its resources: the raw data in the accreditation forms are compared against the formulas defined in the accreditation assessment. The study programme uses the results of this comparison to review the performance of its resources over one year. The system is web-based, developed in the PHP programming language, and uses the MySQL database management system.

... Table 10 shows this relationship between functional requirements, development considerations and the characteristics of Continuous Assurance. With the aforementioned, an architecture was conceptualised in order to provide a solution to the problem addressed in this thesis (R. P. Marques, Santos, & Santos, 2012b, 2012c). Figure 23 schematically represents the conceptual architecture proposed. ...

... Thus, the use of triggers in operational databases is a way of detecting events, since the insertion, update or deletion of a record or a record field marks the occurrence of an event in a given organisational transaction. The component then treats the activation of these triggers as occurrences of transaction events (R. P. Marques et al., 2012c; R. P. Marques et al., 2013c). ...
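The trigger-based event detection described in the excerpt above can be sketched as follows. This is a minimal illustration using SQLite, with a hypothetical `orders` table and `event_log` schema (not the authors' actual implementation): a database trigger turns each data change into an explicit event record that a monitoring component can consume.

```python
import sqlite3

# Hypothetical schema: an operational table plus an event log that a
# continuous-monitoring component would read from.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL);
CREATE TABLE event_log (
    event_id   INTEGER PRIMARY KEY AUTOINCREMENT,
    table_name TEXT,
    operation  TEXT,
    row_id     INTEGER
);
-- The trigger converts each data change into an explicit transaction event.
CREATE TRIGGER orders_insert AFTER INSERT ON orders
BEGIN
    INSERT INTO event_log (table_name, operation, row_id)
    VALUES ('orders', 'INSERT', NEW.id);
END;
""")

# An ordinary business operation...
conn.execute("INSERT INTO orders (amount) VALUES (120.50)")

# ...is now visible to the monitoring component as an event.
events = conn.execute(
    "SELECT table_name, operation, row_id FROM event_log"
).fetchall()
print(events)  # [('orders', 'INSERT', 1)]
```

In a production deployment the same idea would typically use triggers on the ERP's own database tables; SQLite is used here only to keep the sketch self-contained.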

  • Rui Pedro Marques

The current highly regulated business environment has compelled organisations to increase their efforts to monitor and manage their control mechanisms. This awareness has been propelled by the increasing emergence of new regulatory requirements on continuous monitoring and continuous auditing of organisational transactions. Furthermore, the successive well-known scandals in organisations, which have had a very negative impact on their operational performance and also on their corporate image, have shown that the traditional audit process is not sufficient to meet organisations' needs. Thus, organisations have been looking for solutions to improve and strengthen their risk control structures so as to provide greater assurance of the effectiveness of the risk management of their activities, namely in controlling, monitoring and auditing organisational transactions. Hence the concept of Continuous Assurance has emerged: the set of services which, making use of technology, takes information directly from organisational transactions and produces audit results simultaneously with, or within a short period of time after, the occurrence of relevant events. This thesis therefore focuses on the implementation of continuous assurance services in information systems in order to determine the degree of reliability with which transactions are carried out, mitigating organisational risk. It aims to contribute to a new vision of organisational auditing focused on assurance services in transactions executed and supported exclusively in a digital format, according to an ontological model of organisational transactions. The motivation and objective of this thesis led to the following research challenges:

• Validate a set of characteristics that any information system with continuous assurance services must provide. The literature on this topic is not very explicit about the metrics which should be taken into consideration when evaluating this type of information system. Thus, the Delphi method was used to validate a set of essential and very important characteristics for information systems with continuous assurance. In addition, this work contributes a model comprising dimensions and metrics, which allows it to be used as a tool or as a set of guidelines to evaluate information systems with embedded control.

• Ensure the feasibility of developing, and effectively using, an information system with full continuous assurance services, supported by an ontological model, and which is considerably flexible and adaptable so as to be applicable to any organisational transaction. Following the Design Science methodology, a proposal of a solution is presented. This proposal includes requirements, a modular architecture and the development of a prototype. All these steps were supported by an ontological model of organisational transactions so that they could be represented in a very detailed, objective and comprehensive way. Furthermore, the solution was implemented in a simulated organisational environment, and its results allow the conclusion that the presented architecture is an effective solution, since it provides continuous assurance to any organisational transaction, supported by an ontological model. Moreover, this work demonstrates that a repository which allows the instantiation of execution patterns (risk profiles) for each organisational transaction is an important element in information systems with continuous assurance services, as a source of references to support continuous monitoring, auditing and controlling of the risk associated with the execution of organisational transactions.

... Researchers have long pointed out the importance of Continuous Monitoring (CM) and auditing of information systems. To emphasize the importance of CA of organizational transactions, Marques et al. (2012) proposed a solution under a new vision for organizational auditing and monitoring. There is also increasing research on the applications of artificial intelligence in auditing. ...

... These authors present a classification of CAATTs by available features, in which they include Data Mining Classification, comprising algorithms to explore and classify large amounts of data. The trend in CAATTs is the use of mining techniques to analyse log records in order to detect events and to allow the linkage of events that could not be detected or associated a priori [18]. ...

Computer Assisted Audit Tools and Techniques (CAATTs) are nowadays a regular presence in Chartered Accountants' (CAs') daily tasks: several previous studies state that Generalized Audit Software is present in CAs' routines, and some specific tools are gaining ground among these professionals' preferences. The present research reveals that "Data Extraction and Analytics" and "Sampling" tools are the most common information technologies in auditing work. Computer-assisted techniques related to data mining are still not expressive in this reference group, or are only used by a small group of experts, mainly at big companies. New trends in CAATTs are arising mainly as a consequence of changes in business and technology. This paper intends to draw the big picture of the topic and to anticipate new trends in the area: Big Data, Cloud Auditing and Emerging Technologies will be presented. The paper also discusses how auditors can prepare for the new trends and proposes a new classification for Computer Assisted Audit Tools and Techniques.

... In this context, risk profiles refer to the classification of different types of behavior that may occur in the execution of one transaction. In this work, two terms are considered to characterize risk profiles: negative profiles, which refer to all unwanted behaviors during the execution of transactions, for example incomplete or poorly executed operations, lack of crucial procedures, non-conformities, delays, incongruities and malfeasance; and positive profiles, which refer to all valid and appropriate events [16][17][18][19]. ...
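The positive/negative risk-profile distinction described above can be illustrated with a toy classifier. The transaction step names and the profile set below are hypothetical (loosely inspired by an ontological transaction pattern), not the paper's actual repository contents:

```python
# A "positive" profile is a known-valid sequence of transaction events;
# any trace that matches no stored pattern is flagged as a negative
# (risky) execution: missing steps, wrong order, incomplete runs, etc.
POSITIVE_PROFILES = {
    ("request", "promise", "execute", "declare", "accept"),  # complete run
}

def classify(trace):
    """Return 'positive' if the trace matches a valid pattern, else 'negative'."""
    return "positive" if tuple(trace) in POSITIVE_PROFILES else "negative"

print(classify(["request", "promise", "execute", "declare", "accept"]))  # positive
print(classify(["request", "execute"]))  # negative: crucial procedures missing
```

A real repository would of course store many profiles per transaction type and match partial traces as they arrive, rather than only complete ones.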

This paper aims to present a repository whose main objective is to store and manage data derived from internal control mechanisms. The proposed repository receives information about the execution of organizational transactions collected by internal control mechanisms in an ERP database, thereby constituting an important tool for the continuous monitoring of organizational transactions. Thus, this paper presents the repository, a possible continuous monitoring application and the results necessary to assess its effectiveness in continuous monitoring.

... Some approaches specifically address the CEP-based monitoring of business processes to ensure their adherence to the enterprise's compliance policies (Birukou et al., 2010; Kharbili and Stojanovic, 2009; Mulo et al., 2012). Other concepts apply the technology to observe organizational transactions in terms of risk management (Marques et al., 2012) or to assure conformity with predetermined process structures (Weidlich et al., 2011). In most cases, the approaches allow rules for the CEP component to be derived from dedicated compliance or process models. ...

Purpose – The business operations of today's enterprises are heavily influenced by numerous internal and external business events. With Event-Driven Architecture and particularly Complex Event Processing (CEP), the technology required for identifying complex correlations in these large amounts of event data right after they appear has already emerged. The resulting gain in operational transparency builds the foundation for (near) real-time reactions. This has motivated extensive research activities, especially in the field of Business Process Management (BPM), which essentially coined the term Event-Driven BPM (EDBPM). Now, several years after the advent of this new concept, the purpose of this paper is to shed light on the question: where are we now on our way towards a sophisticated adoption of the CEP technology within BPM? Design/methodology/approach – The research methodology of this paper is a structured literature analysis. It basically follows the procedure proposed by vom Brocke et al. (2009). This verified five-step process – entitled "Reconstructing the giant" – allowed a rigorous study. As a result, various research clusters were derived, whose state of the art exposed existing research gaps within EDBPM. Findings – First of all, the paper provides a concise conceptual basis on different application possibilities of EDBPM. Afterwards, it synthesizes current research into six clusters and highlights the most significant work within them. Finally, a research agenda is proposed to tackle existing research gaps and to pave the way towards fully realizing the potentials of the paradigm. Originality/value – So far, a comparable study of the current state of the art within EDBPM has been non-existent. The findings of this paper, e.g. the proposed research agenda, help scholars to focus their research efforts on specific aspects that need to be considered in order to advance the adoption of the CEP technology within BPM.

... Currently, however, some information systems of this type, still under research, are emerging. These offer features of Continuous Assurance which cover the three components, meet the majority of the four levels of objectives and provide the benefits of Continuous Assurance [7,14]. ...

Nowadays there is a need for real-time awareness to assure the conformity of organizational transactions, in order to increase their reliability and to mitigate organizational risk. In this context, Continuous Assurance has assumed an important role as a management goal and in ensuring the improved effectiveness of organizations. Some information systems already support Continuous Assurance services, but the available data require extra effort to make them useful for management purposes. Hence, this paper presents a model comprising five dimensions that aims to evaluate an information system with Continuous Assurance services. Moreover, for each dimension some metrics are proposed to guide the development of an evaluation tool.

  • Jeremiah Onunga

There are some similarities between Financial Statement Audit (FSA) and Information System Audit (ISA). FSA is an examination of the reliability and integrity of financial statement records, whereas ISA is a review and evaluation of the controls, risks and system development within an information system infrastructure, to determine whether the information systems protect against abuse, safeguard assets, maintain data integrity and operate effectively to achieve the organization's going-concern objective. Decision makers need to ensure that the process of collecting and evaluating evidence of an organization's information systems, practices and operations is reliable. Data manipulation can be caused by an external or internal threat. The internal manipulation threat is the most dangerous one because it is committed by authorized personnel, which makes it very difficult to detect. In particular, the framework introduces an anomaly detection technique, one of the data mining methods, to determine the suspected transactions arising from both internal and external threats. Once the suspected transactions are identified, procedures and monitoring controls will be put in place to minimize each threat. The proposed framework is expected to help both university and ministry of higher education managers at all levels to make vital decisions based on reliable and accurate information in East Africa.
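As a rough illustration of the kind of anomaly detection the abstract above refers to (the framework's actual algorithm is not specified there), a simple z-score test can flag transactions whose values deviate strongly from the historical pattern. The threshold and the data below are made-up examples:

```python
import statistics

def suspected(amounts, threshold=2.0):
    """Flag amounts whose z-score against the sample exceeds the threshold."""
    mean = statistics.mean(amounts)
    stdev = statistics.stdev(amounts)  # sample standard deviation
    return [a for a in amounts if abs(a - mean) / stdev > threshold]

# Nine routine transactions around 100 plus one large outlier.
history = [100, 102, 98, 101, 99, 100, 103, 97, 100, 500]
print(suspected(history))  # [500]
```

Real frameworks typically use richer features (user, time, sequence of events) and more robust detectors, but the principle of flagging deviations from learned normal behaviour is the same.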

In a process-based organization, while actors perform their actions, multiple business transaction instances run concurrently. Due to this rising complexity, many risks may manifest during business operation when comparing the prescribed business transactions with their instances. The manifestation of a risk is non-conformant behavior during the execution of transactions. Some risks are negative and should be avoided (or even revoked), while others are positive and could be incorporated into an evolvable risk pattern repository. By evolvable we mean the control system's ability to accommodate, at run time, new business transaction models and risk patterns. This paper conceptualizes a fine-grained approach to controlling the execution of business transactions and presents an instantiation of the concepts in a functional architecture, which may be applied to any enterprise information system (e.g. an ERP). The conceptualization covers the existing concepts and relationships for feedforward and feedback control schemas. The instantiation shows how to implement both control schemas and describes the functional components required. Further, our approach is illustrated via a case study encompassing three enterprises in a simulated environment. Finally, a comparative discussion identifies the benefits of enforcing feedforward and feedback control schemas as a solution to guarantee the compliance of business process execution.

Purpose – The paper aims to present a solution which makes it possible to control and audit organizational transactions in real time, helping to determine the degree of reliability with which they are carried out, mitigating the organizational risk. This auditing is made at a very low level of organizational transactions executed and supported exclusively in a digital format, contrary to what happens in most monitoring of transactions, which occurs at a high level. Moreover, it describes the conceptual architecture of the solution, its components and functionalities as well as the development and technical issues which should be taken into consideration on the deployment and evaluation of the solution. Design/methodology/approach – The work follows the design science methodology. It presents the problem and motivation of the investigation, the solution design and how it is being deployed. Furthermore, it presents the expected results based on the proposed architecture and on the results which are currently being achieved with the prototype implementation. Findings – The prototype is being put into practice, thus the gathering of results and their evaluation is not yet complete. However, preliminary results are really satisfactory and very close to those expected and enumerated. Originality/value – The research contributes to a new vision of organizational auditing focused on assurance services in transactions executed and supported in a digital format in compliance with the formalisms of a business ontological model of organizational transactions.

Based on evidence from press articles covering 39 corporate fraud cases that went public during the period 1992–2005, the objective of this article is to examine the role of managers' behavior in the commitment of the fraud. This study integrates the fraud triangle (FT) and the theory of planned behavior (TPB) to gain a better understanding of fraud cases. The results of the analysis suggest that personality traits appear to be a major fraud-risk factor. The analysis was further validated through a quantitative analysis of keywords which confirmed that keywords associated with the attitudes/rationalizations component of the integrated theory were predominately found in fraud firms as opposed to a sample of control firms. The results of the study suggest that auditors should evaluate the ethics of management through the components of the TPB: the assessment of attitude, subjective norms, perceived behavioral control and moral obligation. Therefore, it is potentially important that the professional standards that are related to fraud detection strengthen the emphasis on managers' behavior that may be associated with unethical behavior.

  • Amedeo Giorgi

This article points out the criteria necessary in order for a qualitative scientific method to qualify itself as phenomenological in a descriptive Husserlian sense. One would have to employ (1) description (2) within the attitude of the phenomenological reduction, and (3) seek the most invariant meanings for a context. The results of this analysis are used to critique an article by Klein and Westcott (1994), that presents a typology of the development of the phenomenological psychological method.