    • 3. Granted patent
    • Cache spill management techniques using cache spill prediction
    • Publication No.: US08407421B2
    • Publication date: 2013-03-26
    • Application No.: US12639214
    • Filing date: 2009-12-16
    • Inventors: Simon C. Steely, Jr.; William C. Hasenplaugh; Aamer Jaleel; George Z. Chrysos
    • IPC: G06F12/00
    • CPC: G06F12/0806; G06F12/12
    • Abstract: An apparatus and method is described herein for intelligently spilling cache lines. Usefulness of cache lines previously spilled from a source cache is learned, such that later evictions of useful cache lines from a source cache are intelligently selected for spill. Furthermore, another learning mechanism—cache spill prediction—may be implemented separately or in conjunction with usefulness prediction. The cache spill prediction is capable of learning the effectiveness of remote caches at holding spilled cache lines for the source cache. As a result, cache lines are capable of being intelligently selected for spill and intelligently distributed among remote caches based on the effectiveness of each remote cache in holding spilled cache lines for the source cache.
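The spill-prediction idea in the abstract — learning how effective each remote cache is at holding spilled lines, and spilling only to caches that have proven useful — can be illustrated with a minimal sketch. The class name, counter width, and threshold below are illustrative assumptions, not details taken from the patent:

```python
# Hypothetical sketch of cache spill prediction: one saturating counter
# per remote cache learns how often lines spilled there were later
# re-used (hit) before being evicted unused. A spill is directed to a
# remote cache only while its counter is above a confidence threshold.
# All names, widths, and thresholds here are illustrative assumptions.

class SpillPredictor:
    def __init__(self, num_remote, max_count=3, threshold=2):
        # Start each counter at a neutral midpoint.
        self.counters = [max_count // 2] * num_remote
        self.max_count = max_count
        self.threshold = threshold

    def should_spill_to(self, remote_id):
        # Spill only to remote caches that have proven effective.
        return self.counters[remote_id] >= self.threshold

    def on_spilled_line_hit(self, remote_id):
        # A spilled line was re-used from this remote cache: reinforce.
        self.counters[remote_id] = min(self.max_count,
                                       self.counters[remote_id] + 1)

    def on_spilled_line_evicted_unused(self, remote_id):
        # The spill was wasted: decay confidence in this remote cache.
        self.counters[remote_id] = max(0, self.counters[remote_id] - 1)
```

Under this sketch, a remote cache that keeps re-supplying spilled lines accumulates confidence and keeps receiving spills, while one that silently drops them is eventually skipped — the "intelligent distribution" the abstract describes.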
    • 4. Patent application
    • Cache spill management techniques
    • Publication No.: US20110145501A1
    • Publication date: 2011-06-16
    • Application No.: US12639214
    • Filing date: 2009-12-16
    • Inventors: Simon C. Steely, Jr.; William C. Hasenplaugh; Aamer Jaleel; George Z. Chrysos
    • IPC: G06F12/08; G06F12/00
    • CPC: G06F12/0806; G06F12/12
    • Abstract: An apparatus and method is described herein for intelligently spilling cache lines. Usefulness of cache lines previously spilled from a source cache is learned, such that later evictions of useful cache lines from a source cache are intelligently selected for spill. Furthermore, another learning mechanism—cache spill prediction—may be implemented separately or in conjunction with usefulness prediction. The cache spill prediction is capable of learning the effectiveness of remote caches at holding spilled cache lines for the source cache. As a result, cache lines are capable of being intelligently selected for spill and intelligently distributed among remote caches based on the effectiveness of each remote cache in holding spilled cache lines for the source cache.
    • 7. Patent application
    • Instruction Prefetching Using Cache Line History
    • Publication No.: US20120084497A1
    • Publication date: 2012-04-05
    • Application No.: US12895387
    • Filing date: 2010-09-30
    • Inventors: Samantika Subramaniam; Aamer Jaleel; Simon C. Steely, Jr.
    • IPC: G06F12/06; G06F12/08
    • CPC: G06F12/0862; G06F9/3816; G06F2212/452; G06F2212/6024; Y02D10/13
    • Abstract: An apparatus of an aspect includes a prefetch cache line address predictor to receive a cache line address and to predict a next cache line address to be prefetched. The next cache line address may indicate a cache line having at least 64-bytes of instructions. The prefetch cache line address predictor may have a cache line target history storage to store a cache line target history for each of multiple most recent corresponding cache lines. Each cache line target history may indicate whether the corresponding cache line had a sequential cache line target or a non-sequential cache line target. The cache line address predictor may also have a cache line target history predictor. The cache line target history predictor may predict whether the next cache line address is a sequential cache line address or a non-sequential cache line address, based on the cache line target history for the most recent cache lines.
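The cache-line-target-history mechanism in this abstract — recording whether each recent instruction cache line fell through sequentially or branched to a non-sequential line, then using the recent pattern to pick the next line to prefetch — can be sketched as follows. The class name, history length, and fallback policy are illustrative assumptions, not details from the patent:

```python
# Hypothetical sketch of cache-line-target-history prefetch prediction:
# for each recently executed instruction cache line, record whether
# control flow left it sequentially (fell through to line + 64 bytes)
# or non-sequentially (branched elsewhere). The recent history then
# drives the prediction of the next cache line address to prefetch.
# Names, sizes, and the majority-vote policy are illustrative.

from collections import deque

LINE_BYTES = 64  # a cache line holding at least 64 bytes of instructions

class LinePrefetchPredictor:
    def __init__(self, history_len=4):
        # True = the line had a sequential target, False = non-sequential.
        self.history = deque(maxlen=history_len)
        # Last observed non-sequential target, keyed by line address.
        self.nonseq_target = {}

    def record(self, line_addr, next_line_addr):
        sequential = (next_line_addr == line_addr + LINE_BYTES)
        self.history.append(sequential)
        if not sequential:
            self.nonseq_target[line_addr] = next_line_addr

    def predict_next(self, line_addr):
        # Majority vote over recent history: if recent lines were mostly
        # sequential, prefetch the next sequential line; otherwise fall
        # back to a remembered non-sequential target for this line.
        if not self.history or sum(self.history) * 2 >= len(self.history):
            return line_addr + LINE_BYTES
        return self.nonseq_target.get(line_addr, line_addr + LINE_BYTES)
```

In this sketch the predictor degenerates to plain next-line prefetching in straight-line code, and only consults the per-line target table once recent history shows control flow is dominated by non-sequential transfers.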
    • 8. Granted patent
    • Instruction prefetching using cache line history
    • Publication No.: US08533422B2
    • Publication date: 2013-09-10
    • Application No.: US12895387
    • Filing date: 2010-09-30
    • Inventors: Samantika Subramaniam; Aamer Jaleel; Simon C. Steely, Jr.
    • IPC: G06F12/06; G06F12/08
    • CPC: G06F12/0862; G06F9/3816; G06F2212/452; G06F2212/6024; Y02D10/13
    • Abstract: An apparatus of an aspect includes a prefetch cache line address predictor to receive a cache line address and to predict a next cache line address to be prefetched. The next cache line address may indicate a cache line having at least 64-bytes of instructions. The prefetch cache line address predictor may have a cache line target history storage to store a cache line target history for each of multiple most recent corresponding cache lines. Each cache line target history may indicate whether the corresponding cache line had a sequential cache line target or a non-sequential cache line target. The cache line address predictor may also have a cache line target history predictor. The cache line target history predictor may predict whether the next cache line address is a sequential cache line address or a non-sequential cache line address, based on the cache line target history for the most recent cache lines.