05 Fakultät Informatik, Elektrotechnik und Informationstechnik

Permanent URI for this collection: https://elib.uni-stuttgart.de/handle/11682/6

Search Results

Now showing 1 - 2 of 2
  • Open Access
    Models for Internet of Things environments: a survey
    (2020) Franco da Silva, Ana Cristina; Hirmer, Pascal
    Today, the Internet of Things (IoT) is an emerging topic in research and industry. Well-known examples of IoT applications are smart homes, smart cities, and smart factories. Through highly interconnected devices equipped with sensors and actuators, context-aware approaches can be developed to enable, e.g., monitoring and self-organization. To achieve context-awareness, a large number of environment models have been developed for the IoT that contain information about the devices of an environment, their attached sensors and actuators, as well as their interconnections. However, these models differ widely in their content, the format used (for example, ontologies or relational models), and the domain to which they are applied. In this article, we present a comparative survey of models for IoT environments. In doing so, we describe and compare the selected models based on a thorough literature review. The result is a comparative overview of existing state-of-the-art IoT environment models.
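    As a rough illustration of what such an environment model captures (device attributes, attached sensors and actuators, and device interconnections), a minimal relational-style sketch in Python might look like the following. All names here are hypothetical and not taken from any of the surveyed models:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Sensor:
        name: str          # e.g. "temperature"
        unit: str          # e.g. "celsius"

    @dataclass
    class Actuator:
        name: str          # e.g. "valve"

    @dataclass
    class Device:
        device_id: str
        sensors: list[Sensor] = field(default_factory=list)
        actuators: list[Actuator] = field(default_factory=list)
        connected_to: list[str] = field(default_factory=list)  # ids of linked devices

    # A tiny smart-home environment: a thermostat senses temperature
    # and is interconnected with a heater device that exposes a valve actuator.
    thermostat = Device("thermostat-1", sensors=[Sensor("temperature", "celsius")])
    heater = Device("heater-1", actuators=[Actuator("valve")])
    thermostat.connected_to.append(heater.device_id)
    ```

    An ontology-based model would express the same information as classes and properties (e.g. in OWL) rather than as records, which is one of the format differences the survey compares.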
  • Open Access
    Analyzing the influence of hyper-parameters and regularizers of topic modeling in terms of Rényi entropy
    (2020) Koltcov, Sergei; Ignatenko, Vera; Boukhers, Zeyd; Staab, Steffen
    Topic modeling is a popular technique for clustering large collections of text documents, and various types of regularization are implemented in topic modeling. In this paper, we propose a novel approach for analyzing the influence of different regularization types on the results of topic modeling. Based on Rényi entropy, this approach is inspired by concepts from statistical physics, where the inferred topical structure of a collection can be considered an information statistical system residing in a non-equilibrium state. By testing our approach on four models, namely Probabilistic Latent Semantic Analysis (pLSA), Additive Regularization of Topic Models (BigARTM), Latent Dirichlet Allocation (LDA) with Gibbs sampling, and LDA with variational inference (VLDA), we first show that the minimum of Rényi entropy coincides with the “true” number of topics, as determined in two labelled collections. At the same time, we find that the Hierarchical Dirichlet Process (HDP) model, a well-known approach for topic number optimization, fails to detect this optimum. Next, we demonstrate that large values of the regularization coefficient in BigARTM significantly shift the entropy minimum away from the optimal topic number, an effect not observed for the hyper-parameters in LDA with Gibbs sampling. We conclude that regularization may introduce unpredictable distortions into topic models, which calls for further research.
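    The Rényi entropy underlying this analysis can be illustrated with a small sketch. The formula below is the standard order-q Rényi entropy of a probability vector; it is not necessarily the exact estimator the authors apply to topic models, and the toy data is hypothetical:

    ```python
    import numpy as np

    def renyi_entropy(p, q=2.0):
        """Rényi entropy of order q: H_q = log(sum_i p_i^q) / (1 - q).

        For q -> 1 this reduces to Shannon entropy, which is handled
        as a special case below.
        """
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                    # drop zero-probability entries
        if np.isclose(q, 1.0):          # Shannon limit
            return float(-np.sum(p * np.log(p)))
        return float(np.log(np.sum(p ** q)) / (1.0 - q))

    # Toy illustration: a sharply peaked distribution has lower Rényi
    # entropy than a uniform one, so scanning entropy as a function of
    # the number of topics can exhibit a minimum (hypothetical usage,
    # not the authors' code).
    uniform = np.full(10, 0.1)
    peaked = np.array([0.82] + [0.02] * 9)
    assert renyi_entropy(peaked, q=2.0) < renyi_entropy(uniform, q=2.0)
    ```

    For a uniform distribution over n outcomes the Rényi entropy equals log(n) for every order q, which makes it a convenient reference point when comparing topic solutions of different sizes.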