Commentary on Section 264 of ITAA 1936: commissioner may require information and evidence
Dabner, Justin
2007-01-01
Search results
1,869 records were found.
This paper describes version 2 of the NuSMV tool. NuSMV is a symbolic model checker that originated from the reengineering, reimplementation and extension of SMV, the original BDD-based model checker developed at CMU. The NuSMV project aims at the development of a state-of-the-art symbolic model checker, designed to be applicable in technology transfer projects: it is a well-structured, open, flexible and documented platform for model checking, and is robust and close to industrial systems standards.
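The kind of system NuSMV checks can be illustrated with a small model in the SMV input language (a sketch in the style of the examples in the NuSMV documentation; the variable names are illustrative):

```
MODULE main
VAR
  request : boolean;          -- unconstrained input
  state   : {ready, busy};
ASSIGN
  init(state) := ready;
  next(state) := case
    state = ready & request : busy;
    TRUE                    : {ready, busy};  -- nondeterministic otherwise
  esac;
-- CTL property: every request is eventually served
SPEC AG (request -> AF state = busy)
```

NuSMV builds a BDD representation of the transition relation and verifies the `SPEC` property against all reachable states.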
This paper studies taxpayers' "styles" of tax evasion. It starts from a brief theoretical formulation of the taxpayer's decision problem which incorporates a psychological element into the usual expected-utility-maximisation approach. This psychological component is founded on the hypothesis that taxpayers perceive as a moral cost the awareness that, by evading, they are stealing their contribution to the tax yield from their fellow citizens. The theoretical model was tested in three experiments involving 90 experimental subjects. The most important finding to emerge from the experiments is that the traditional theoretical treatment of uncertainty and risk could not satisfactorily explain the subjects' behaviour when faced with the uncertain choice of evasion. When the experimental subjects had t...
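The decision problem described can be given a minimal numerical sketch, assuming a standard expected-utility setup (log utility of wealth, a fine proportional to evaded tax) extended with an additive moral-cost term; the functional forms and parameter values below are illustrative assumptions, not the paper's:

```python
import math

def expected_utility(declared, income, tax_rate, audit_prob, fine_rate, moral_weight):
    """Expected utility of declaring `declared` out of `income`.

    Evading (income - declared) saves tax but risks a fine proportional
    to the evaded tax, and carries a moral cost proportional to the
    evaded amount (the psychological component).
    """
    evaded_tax = tax_rate * (income - declared)
    # wealth if not audited: tax is paid only on declared income
    w_clear = income - tax_rate * declared
    # wealth if audited: full tax plus a fine on the evaded tax
    w_caught = income - tax_rate * income - fine_rate * evaded_tax
    moral_cost = moral_weight * (income - declared)
    return ((1 - audit_prob) * math.log(w_clear)
            + audit_prob * math.log(w_caught)
            - moral_cost)

def best_declaration(income, **params):
    """Grid-search the declaration level that maximises expected utility."""
    grid = [income * k / 100 for k in range(101)]
    return max(grid, key=lambda d: expected_utility(d, income, **params))
```

With a sufficiently high moral weight the optimal declaration is full income, while a purely "economic" taxpayer facing a low audit probability declares much less, which is the contrast the psychological term is meant to capture.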
We extend the π-calculus and the spi-calculus with two primitives that guarantee authentication. They enable us to abstract from various implementations/specifications of authentication, and to obtain idealized protocols which are "secure by construction". The main underlying idea, originally proposed for entity authentication, is to use the locations of processes in order to check who is sending a message (authentication of a party) and who originated a message (message authentication). The theory of local names, developed for the π-calculus, gives us almost for free both the partner-authentication and the message-authentication primitives.
In the dynamic model presented in the paper, manufacturing and service firms coexist. They use labor inputs provided by households which buy both the manufactured good and the service. The latter may differ in its quality depending on the effort level of the firms' employees. Service firms must invest in reputation for quality. As the long-term equilibrium emerging in a competitive framework is characterized by unemployment, the imposition of a binding wage floor lowers employment in the service sector without affecting the employment level of the manufacturing sector: the wage differentials between the two sectors shrink and the quality level of the service improves, but unemployment increases. As the competitive solution leads the economy to a full-employment steady state, a binding but relatively low minimum wage may bring about a mo...
We study a deterministic model for the dynamics of a population infected by macroparasites. The model consists of an infinite system of partial differential equations, with initial and boundary conditions; the system is transformed into an abstract Cauchy problem on a suitable Banach space, and existence and uniqueness of the solution are obtained through multiplicative perturbation of a linear C₀-semigroup. Positivity and boundedness are proved using the specific form of the equations.
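In standard semigroup notation, an abstract Cauchy problem of the kind referred to above takes the form (a generic sketch, not the paper's exact system):

```latex
\frac{du}{dt}(t) = A\,u(t), \quad t > 0, \qquad u(0) = u_0 \in X,
```

whose solution is $u(t) = T(t)u_0$, where $\{T(t)\}_{t \ge 0}$ is the $C_0$-semigroup generated by $A$ on the Banach space $X$. Roughly, a multiplicative perturbation argument obtains the semigroup for the full problem from a known generator $A$ by passing to a product such as $BA$ for a suitable operator $B$.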
Structurally similar economies sharing the same wage-setting institutions tend to grow at the same rate, thus preserving their differences in levels of output per capita and employment rate, if the value of the workers’ outside options is rigid and equal across economies. In this case, i) multiple balanced growth paths can be possible, ii) the sustainable rate of growth is higher in economies with competitive wage determination than it is in unionized economies, and iii) this growth differential becomes larger in the presence of an integrated capital market. As the workers’ outside option depends on fiscal transfers responding to changes in levels of output and employment, there is convergence in levels across structurally and institutionally similar economies. In this case, i) economies with competitive wage setting converge at higher...
Current standards for video compression achieve good performance in terms of data compaction and signal-to-noise ratio of the decoded signal. Nevertheless, there are some known problems concerning the visual quality of reconstructed images, which can be partially solved using appropriate post-processing algorithms. The paper proposes a new adaptive anisotropic filter (AAF) that aims at unifying the treatment of different sources of perceptive distortion in MPEG sequences. The process is driven by a local classification of blocks and single pixels of decoded frames, taking into account several parameters (distribution of DCT coefficient energy, presence of sharp variations, spatial position of DCT block boundaries). Experimental results show that the proposed algorithm outperforms existing enhancement approaches, in particular when con...
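The adaptive idea (smooth only across neighbours whose grey-level difference is small, so block artifacts in flat regions are attenuated while sharp variations are preserved) can be sketched in pure Python. This is a generic edge-adaptive filter for illustration, not the paper's AAF; the threshold and weight are assumptions:

```python
def adaptive_smooth(img, edge_thresh=30, strength=0.5):
    """Edge-adaptive smoothing on a 2-D list of grey levels.

    Each pixel is blended with the average of the 4-neighbours whose
    grey-level difference stays below `edge_thresh`; neighbours across
    a sharp variation are excluded, so edges survive the filtering.
    """
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(h):
        for j in range(w):
            centre = img[i][j]
            acc, n = 0.0, 0
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if (0 <= ni < h and 0 <= nj < w
                        and abs(img[ni][nj] - centre) < edge_thresh):
                    acc += img[ni][nj]
                    n += 1
            if n:
                out[i][j] = (1 - strength) * centre + strength * acc / n
    return out
```

In a real post-processor the threshold and strength would be driven by the local block/pixel classification (DCT energy, block-boundary position) rather than being global constants.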
Sir John R. Hicks, Nobel laureate in economics, once said that an economist who is only an economist is not a good economist. In times of high specialization and segmentation of knowledge, this view is probably out of fashion and regarded as dangerous. Moreover, economics and its clerks are generally considered prone to "cultural imperialism" by other social scientists. Since I regard John Hicks as a master of thought, but also disapprove of the imperialistic grasp of economics on all aspects of human life, I agreed to contribute to this conference as a valuable opportunity to investigate problems and exchange ideas outside the high walls of the economic citadel, in the spirit of a citizen of the world engaged in the search for more equitable, prosperous and peaceful human relations. Hence I have limited the "professional" economic p...
Adaptive model selection can be defined as the process by which an optimal classifier h* is automatically selected from a function class H using only a given set of examples z. This process is particularly critical when the number of examples in z is small, because the classical splitting of z into training, test and validation sets becomes impossible. In this work we show that the joint investigation of two bounds on the prediction error of the classifier can be used to select h*, using z for both model selection and training. Our learning algorithm is a simple kernel-based Perceptron that can be easily implemented in counter-based digital hardware. Experiments on two real-world data sets show the validity of the proposed method.
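The classifier described, a kernel Perceptron whose entire state is one mistake counter per training example (which is what makes a counter-based hardware implementation natural), can be sketched as follows. This is an illustrative implementation, not the authors' code; the RBF kernel and its parameter are assumptions:

```python
import math

def rbf(x, y, gamma=0.5):
    """Gaussian (RBF) kernel between two feature vectors."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

class KernelPerceptron:
    """Kernel Perceptron: the model is a vector of per-example mistake
    counters alpha_i, updated by simple increments."""

    def __init__(self, kernel=rbf):
        self.kernel = kernel
        self.X, self.y, self.alpha = [], [], []

    def _score(self, x):
        return sum(a * yi * self.kernel(xi, x)
                   for a, yi, xi in zip(self.alpha, self.y, self.X))

    def fit(self, X, y, epochs=10):
        self.X, self.y = list(X), list(y)
        self.alpha = [0] * len(self.X)
        for _ in range(epochs):
            mistakes = 0
            for i, (xi, yi) in enumerate(zip(self.X, self.y)):
                if yi * self._score(xi) <= 0:   # misclassified
                    self.alpha[i] += 1          # counter increment only
                    mistakes += 1
            if mistakes == 0:                   # converged on z
                break
        return self

    def predict(self, x):
        return 1 if self._score(x) > 0 else -1
```

With an RBF kernel the algorithm separates training sets that are not linearly separable in input space, e.g. the XOR pattern, while the update remains a bare counter increment.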
