  1. intuition - What is perplexity? - Cross Validated

    So perplexity represents the number of sides of a fair die that, when rolled, produces a sequence with the same entropy as your given probability distribution. Number of States: OK, so now that … (A sketch of this 2^entropy identity follows the result list.)

  2. What do you think of Perplexity AI? Will it be the future trend of search? - 知乎

    Perplexity AI is not the end point of search, but it may be the starting point of our escape from the "information landfill." It is like the GPT-4 of the search-engine world: it understands what you are saying and also knows where to go to find the answer.

  3. Looking for a plain-language explanation: what is perplexity in NLP? - 知乎

    So once the preceding words of the input, i.e. the history, are given, the fewer equally likely outputs the language model allows, the better; fewer outputs means the model better knows, given the history \{e_1 \cdots e_{i-1}\}, what output it should produce …

  4. How should we view Perplexity removing DeepSeek's censorship to provide ... - 知乎

    How should we view Perplexity removing DeepSeek's censorship to provide unbiased, accurate answers? Perplexity: We are pleased to announce that the new DeepSeek R1 model is now available across all Perplexity platforms.

  5. How is perplexity.ai for research use? - 知乎

    Perplexity's fast model, Claude 4.0, GPT-4.1, Gemini 2.5 Pro, Grok 3 beta, Perplexity's unbiased reasoning model, and OpenAI's latest reasoning model. I used it to tell my own fortune: "Please act as a fortune-telling master and help …"

  6. Finding the perplexity of multiple examples - Cross Validated

    Nov 12, 2020 · I am trying to find a way to calculate the perplexity of a language model over multiple 3-word examples from my test set, or the perplexity of the whole test-set corpus. As the test set, I …

  7. 知乎 - Where there is a question, there is an answer

    知乎 (Zhihu), the Chinese internet's high-quality Q&A community and original-content platform for creators, officially launched in January 2011 with the brand mission of "letting people better share knowledge, experience, and insights, and find their own answers." Relying on its serious, professional …

  8. machine learning - Why does lower perplexity indicate better ...

    The perplexity, used by convention in language modeling, is monotonically decreasing in the likelihood of the test data, and is algebraically equivalent to the inverse of the geometric mean … (A corpus-level sketch follows the result list.)

  9. text mining - How to calculate perplexity of a holdout with Latent ...

    I'm confused about how to calculate the perplexity of a holdout sample when doing Latent Dirichlet Allocation (LDA). The papers on the topic breeze over it, making me think I'm missing …

  10. Perplexity calculation in variational neural topic models

    The authors say that: Since log p(X) is intractable in the NVDM, we use the variational lower bound (which is an upper bound on perplexity) to compute the perplexity following Mnih … (A sketch of the bound follows the result list.)

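The first result describes perplexity as the number of sides of a fair die whose rolls carry the same entropy as the given distribution, i.e. perplexity = 2^H(p) with H measured in bits. A minimal sketch of that identity in Python (the distributions below are made-up examples, not taken from the linked answer):

    import math

    def entropy_bits(p):
        """Shannon entropy of a discrete distribution, in bits."""
        return -sum(q * math.log2(q) for q in p if q > 0)

    def perplexity(p):
        """Perplexity = 2 ** entropy: the number of sides of the
        equivalent fair die (the 'effective number of outcomes')."""
        return 2 ** entropy_bits(p)

    print(perplexity([1/6] * 6))                  # ~6.0 -- a fair six-sided die
    print(perplexity([0.5, 0.25, 0.125, 0.125]))  # ~3.36 -- fewer effective outcomes than 4
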
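Results 6 and 8 concern perplexity over a whole test set: by convention it is exp of the average per-token negative log-likelihood, which is the inverse of the geometric mean of the per-token probabilities, so it falls monotonically as the test likelihood rises. A sketch assuming the model is available simply as per-token conditional probabilities (the numbers are illustrative, not from the linked questions):

    import math

    def corpus_perplexity(token_probs):
        """Pool log-probabilities over every token in every example:
        exp(-(1/N) * sum(log p(w_i | history))), i.e. the inverse
        geometric mean of the per-token probabilities. Lower is better."""
        n_tokens = sum(len(example) for example in token_probs)
        log_likelihood = sum(math.log(p) for example in token_probs for p in example)
        return math.exp(-log_likelihood / n_tokens)

    # Three 3-word test examples; each number is p(word | preceding words) under the model.
    test_set = [
        [0.20, 0.10, 0.05],
        [0.30, 0.25, 0.10],
        [0.15, 0.40, 0.20],
    ]
    print(corpus_perplexity(test_set))   # ~6.07

Pooling all tokens before exponentiating, rather than averaging per-example perplexities, is the usual way the corpus-level question in result 6 is handled.
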
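Result 10 relies on the fact that a variational lower bound (ELBO) on log p(X) gives an upper bound on perplexity: since perplexity is exp of a negative average log-likelihood, replacing each log p(X_d) with a smaller ELBO can only increase the result. A sketch with hypothetical ELBO values and document lengths; normalization conventions vary (total-token vs per-document averaging), and the version below uses total-token normalization rather than claiming to reproduce the paper's exact formula:

    import math

    def perplexity_upper_bound(elbos, doc_lengths):
        """exp(-(sum of per-document ELBOs) / (total token count)).
        Because each ELBO <= log p(X_d), this value is >= the true perplexity."""
        return math.exp(-sum(elbos) / sum(doc_lengths))

    elbos = [-420.0, -615.5, -298.2]   # hypothetical variational lower bounds on log p(X_d)
    doc_lengths = [120, 180, 90]       # hypothetical token counts per document
    print(perplexity_upper_bound(elbos, doc_lengths))   # ~30.6, an upper bound on perplexity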