“I had the pleasure to work with Niels as part of a Knowledge Exchange project between the University of Aberdeen and the Scottish Environment Protection Agency (SEPA). Niels was in charge of developing a Semantic Web software tool to enable non-academic users to access academic research outputs. He is a very motivated individual, an excellent software developer, and a fine computer scientist. As part of this project Niels has also made substantial contributions to the wider Semantic Web community by sharing his experience online.”
Niels Christensen
Copenhagen, Capital Region, Denmark
615 followers
500+ connections
About
Available for consulting in software development: Scoping, planning and delivering…
Experience
-
Elfin ApS
Licenses & certifications
Publications
-
Levin’s and Schnorr’s optimality results
Speedup in Computational Complexity
This invited guest blog post focuses on Leonid Levin's Optimal Search Theorem. The theorem is one of the few general results proving that it is not in vain to strive for the fastest solution to a computational task. In contrast to the "negative" result known as Blum's Speedup Theorem, Levin shows that many tasks do have an optimal program. Or, if you prefer that perspective, it points to a severe problem in the concept of big-O optimality.
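The constructive core of Levin's theorem is a dovetailing search: run all candidate programs "in parallel", giving the i-th program an exponentially shrinking share of the time, and return the first verified answer. The sketch below is a hedged illustration, not Levin's construction itself (which enumerates all programs of a universal machine): the `levin_search` function, the doubling step budgets, and the generator-based toy "programs" are all illustrative assumptions.

```python
# Hedged sketch of Levin-style dovetailed search (illustrative model).
# "Programs" are stand-ins: generators that yield once per simulated step
# and `return` a candidate answer when they halt. A verifier filters answers.
# Program i gets roughly a 2^-i share of the time, so the total running time
# is within a program-dependent constant factor of the fastest correct program.

def levin_search(programs, verify):
    live = {i: make() for i, make in enumerate(programs)}
    phase = 0
    while live:
        phase += 1
        for i in sorted(live):
            if i >= phase:
                continue  # program i is not yet scheduled in this phase
            budget = 2 ** (phase - i)  # later programs get smaller shares
            try:
                for _ in range(budget):
                    next(live[i])      # advance one simulated step
            except StopIteration as halted:
                del live[i]            # program i halted with halted.value
                if verify(halted.value):
                    return halted.value
    return None  # every program halted without a verified answer

# Toy instance: find x with x * x == 49.
def spins_forever():
    while True:
        yield

def answers(value, steps):
    def gen():
        for _ in range(steps):
            yield
        return value
    return gen

programs = [spins_forever,   # never halts
            answers(5, 3),   # halts quickly, but wrong
            answers(7, 10)]  # slower, but correct
print(levin_search(programs, lambda a: a * a == 49))  # prints 7
```

The non-halting program never blocks progress, which is the whole point of the dovetailing schedule.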
-
Computational Models with No Linear Speedup
Chicago Journal of Theoretical Computer Science
The linear speedup theorem states, informally, that constants do not matter: It is essentially always possible to find a program solving any decision problem a factor of 2 faster. This result is a classical theorem in computing, but also one of the most debated. The main ingredient of the typical proof of the linear speedup theorem is tape compression, where a fast machine is constructed with a tape alphabet or number of tapes far greater than that of the original machine. In this paper, we prove that limiting Turing machines to a fixed alphabet and a fixed number of tapes rules out linear speedup. Specifically, we describe a language that can be recognized in linear time (e.g., 1.51n), and provide a proof, based on Kolmogorov complexity, that the computation cannot be sped up (e.g., below 1.49n). Without the tape and alphabet limitation, the linear speedup theorem does hold and yields machines of time complexity of the form (1+ε)n for arbitrarily small ε > 0.
Earlier results negating linear speedup in alternative models of computation have often been based on the existence of very efficient universal machines. In the vernacular of programming language theory: these models have very efficient self-interpreters. As the second contribution of this paper, we define a class, PICSTI, of computation models that exactly captures this property, and we disprove the Linear Speedup Theorem for every model in this class, thus generalizing all similar, model-specific proofs.
Preserving the bits of the Danish Internet
5th International Web Archiving Workshop (IWAW05)
In one year, 98% of the atoms in your body will have left you. Yet your memories will remain with you. It is the structures and their interactions that make our memories remain. The same principle applies to digital storage.
This paper shows how the design of a digital repository can be quantitatively related to its longevity. I define a programmatic, probabilistic model of hardware failures and repair operations in a digital repository. The mean time to failure of this model is then computed in a number of experiments based on simulation.
Keywords: web archiving, bit preservation, mean time to failure, simulation of probabilistic models.
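A model of this kind can be illustrated with a small Monte Carlo sketch. This is not the paper's model: it assumes the simplest possible repository, two mirrored copies with exponentially distributed disk lifetimes and repair times, where data is lost when the surviving copy fails before a repair completes. All names and parameter values below are illustrative.

```python
import random

def simulate_mttdl(disk_mttf, mttr, trials=2000, seed=1):
    """Mean time to data loss for a 2-copy repository (Monte Carlo sketch).

    disk_mttf: mean time to failure of one disk (exponential)
    mttr:      mean time to repair a failed disk (exponential)
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        t = 0.0
        while True:
            # Both copies healthy: wait for the first of two failures.
            t += rng.expovariate(2.0 / disk_mttf)
            repair = rng.expovariate(1.0 / mttr)
            survivor = rng.expovariate(1.0 / disk_mttf)
            if survivor < repair:
                t += survivor   # second failure during repair: data lost
                break
            t += repair         # repair finished first: back to two copies
        total += t
    return total / trials

# For disk_mttf >> mttr the classical approximation is mttf^2 / (2 * mttr);
# the simulated mean should land in the same ballpark.
estimate = simulate_mttdl(disk_mttf=1000.0, mttr=10.0)
analytic = 1000.0 ** 2 / (2 * 10.0)
print(estimate, analytic)
```

Varying the replica count, failure distributions, and repair policy in such a simulator is what lets repository design be related quantitatively to longevity.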
Projects
-
Project Chronos Prototype API and Linked Data
My main role has been to guide the integration with Google Cloud and the semantic web storage.
-
PLANETS-EU
Pan-European project that aimed at developing an operational, yet generic pipeline for digital preservation.
-
NetarchiveSuite
Open source software project with a complete toolset for web archiving.
Languages
-
English
Full professional proficiency
-
Danish
Native or bilingual proficiency
Recommendations received
4 people have recommended Niels