    • 4. Granted invention patent
    • Title: Systems and methods for large scale global entity resolution
    • Publication number: US09311301B1
    • Publication date: 2016-04-12
    • Application number: US14750936
    • Filing date: 2015-06-25
    • Assignee: Digital Reasoning Systems, Inc.
    • Inventors: Vishnuvardhan Balluru, Kenneth Graham, Naomi Hilliard
    • IPC classes: G06F17/28; G06N99/00; G06N7/00; G06F17/27
    • CPC classes: G06F17/278; G06F17/271; G06F17/2785; G06N5/022; G06N7/005; G06N99/005
    • Abstract: Systems and methods for coreference resolution are disclosed. In one embodiment, a method includes locating, for each of a selected plurality of chains of coreferent mentions, a particular context-based name from the respective chain, wherein the coreferent mentions correspond to entities and the context-based name is a longest name in the respective chain, a last name in the respective chain, or a most frequently occurring name in the respective chain. The method also includes determining an entity category for each respective one of the plurality of chains and determining one or more entity attributes from structured data and unstructured data. The method further includes, based on the located particular context-based name, the entity category, and the one or more attributes, assigning high-probability coreferent chains to high-confidence buckets, such as to produce a Zipfian-like distribution having a head region and a tail region.
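The abstract's first step, picking a representative "context-based name" for each chain of coreferent mentions, can be sketched in a few lines of Python. This is only an illustration of the three selection strategies the abstract names (longest, last, most frequent); the function name and the sample chain are invented for the example and are not taken from the patent.

```python
from collections import Counter

def context_based_name(chain, strategy="longest"):
    """Pick a representative name from a chain of coreferent mentions.

    Strategies mirror the abstract: the longest name in the chain,
    the last (most recent) name, or the most frequently occurring name.
    """
    if strategy == "longest":
        return max(chain, key=len)
    if strategy == "last":
        return chain[-1]
    if strategy == "most_frequent":
        return Counter(chain).most_common(1)[0][0]
    raise ValueError(f"unknown strategy: {strategy}")

# A hypothetical chain of mentions resolved to one entity.
chain = ["Dr. Naomi Hilliard", "Hilliard", "Naomi Hilliard", "Hilliard"]
print(context_based_name(chain, "longest"))        # Dr. Naomi Hilliard
print(context_based_name(chain, "last"))           # Hilliard
print(context_based_name(chain, "most_frequent"))  # Hilliard
```

The later bucketing step would then combine this name with the chain's entity category and attributes before assigning high-probability chains to high-confidence buckets.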
    • 10. Published invention application
    • Title: Systems and Methods for Neural Language Modeling
    • Publication number: US20160247061A1
    • Publication date: 2016-08-25
    • Application number: US15047532
    • Filing date: 2016-02-18
    • Assignee: Digital Reasoning Systems, Inc.
    • Inventors: Andrew Trask, David Gilmore, Matthew Russell
    • IPC classes: G06N3/04; G06N3/08
    • CPC classes: G06N3/04; G06F17/2715; G06F17/2785; G06N3/02; G06N3/0454; G06N3/08
    • Abstract: In some aspects, the present disclosure relates to neural language modeling. In one embodiment, a computer-implemented neural network includes a plurality of neural nodes, where each of the neural nodes has a plurality of input weights corresponding to a vector of real numbers. The neural network also includes an input neural node corresponding to a linguistic unit selected from an ordered list of a plurality of linguistic units, and an embedding layer with a plurality of embedding node partitions. Each embedding node partition includes one or more neural nodes. Each of the embedding node partitions corresponds to a position in the ordered list relative to a focus term, is configured to receive an input from an input node, and is configured to generate an output. The neural network also includes a classifier layer with a plurality of neural nodes, each configured to receive the embedding outputs from the embedding layer, and configured to generate an output corresponding to a probability that a particular linguistic unit is the focus term.
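The architecture in this abstract, position-specific embedding partitions feeding a classifier that scores each vocabulary item as the focus term, resembles a context-to-word model. The NumPy sketch below is a minimal forward pass under that reading; the vocabulary, dimensions, weight shapes, and function name are all illustrative assumptions, not details from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ["the", "cat", "sat", "on", "mat"]
V, D = len(vocab), 8          # vocabulary size, embedding width (assumed)
positions = [-2, -1, 1, 2]    # context positions relative to the focus term

# One embedding partition (its own weight matrix) per relative position,
# mirroring the abstract's position-specific embedding node partitions.
embed = {p: rng.normal(0, 0.1, (V, D)) for p in positions}

# Classifier layer: one output node per vocabulary item.
W_out = rng.normal(0, 0.1, (len(positions) * D, V))

def predict_focus(context_ids):
    """Return a probability distribution over the vocabulary for the
    focus term, given one context word id per relative position."""
    # Look up each context word in its position's own partition.
    h = np.concatenate([embed[p][i] for p, i in zip(positions, context_ids)])
    logits = h @ W_out
    e = np.exp(logits - logits.max())   # numerically stable softmax
    return e / e.sum()

probs = predict_focus([0, 1, 3, 0])     # context "the cat _ on the"
```

With untrained random weights the distribution is near-uniform; training would adjust each partition so the classifier's output assigns high probability to the true focus term.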