numpy - Python package that supports weighted covariance computation


Is there a Python statistical package that supports the calculation of weighted covariance (i.e., where each observation carries a weight)? Unfortunately, numpy.cov does not support weights.

I am working within a numpy/scipy framework (i.e., able to use numpy arrays to speed up the calculation).
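For reference, the weighted covariance with case weights can be written directly in numpy. A minimal sketch (the function name `weighted_cov` is my own, not from any package): compute the weighted mean, demean, and form the weighted cross-product divided by the (adjusted) sum of weights:

```python
import numpy as np

def weighted_cov(x, w, ddof=0):
    """Covariance of the rows of x, where row i is counted with case weight w[i]."""
    w = np.asarray(w, dtype=float)
    mean = np.average(x, axis=0, weights=w)   # weighted mean of each column
    d = x - mean                              # demeaned data
    # weighted cross-product, normalized by sum of weights (minus ddof)
    return (w * d.T) @ d / (w.sum() - ddof)
```

With integer weights this matches np.cov on the sample expanded by np.repeat, e.g. `weighted_cov(x, w)` equals `np.cov(np.repeat(x, w, axis=0).T, bias=1)`.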

Thanks a lot!

statsmodels supports weighted descriptive statistics, including the weighted covariance, via DescrStatsW in statsmodels.stats.weightstats.

But we can also calculate it directly:

```python
# -*- coding: utf-8 -*-
"""Descriptive statistics with case weights

Author: Josef Perktold
"""
import numpy as np
from statsmodels.stats.weightstats import DescrStatsW

np.random.seed(987467)
x = np.random.multivariate_normal([0, 1.], [[1., 0.5], [0.5, 1.]], size=20)
weights = np.random.randint(1, 4, size=20)
# expanded sample: repeat each row of x according to its integer weight
xlong = np.repeat(x, weights, axis=0)

ds = DescrStatsW(x, weights=weights)
print('cov statsmodels')
print(ds.cov)

# replicate the computation directly
self = ds  # alias, mirrors the attribute access inside the class
ds_cov = np.dot(self.weights * self.demeaned.T, self.demeaned) / self.sum_weights
print('\nddof=0')
print(ds_cov)
print(np.cov(xlong.T, bias=1))

ds_cov1 = np.dot(self.weights * self.demeaned.T, self.demeaned) / \
          (self.sum_weights - 1)
print('\nddof=1')
print(ds_cov1)
print(np.cov(xlong.T, bias=0))
```

This prints:

```
cov statsmodels
[[ 0.43671986  0.06551506]
 [ 0.06551506  0.66281218]]

ddof=0
[[ 0.43671986  0.06551506]
 [ 0.06551506  0.66281218]]
[[ 0.43671986  0.06551506]
 [ 0.06551506  0.66281218]]

ddof=1
[[ 0.44821249  0.06723914]
 [ 0.06723914  0.68025461]]
[[ 0.44821249  0.06723914]
 [ 0.06723914  0.68025461]]
```
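Note that newer versions of numpy (1.10 and later) make the expansion with np.repeat unnecessary: np.cov accepts integer frequency weights via the `fweights` parameter (and reliability weights via `aweights`), so the weighted covariance can be computed in one call:

```python
import numpy as np

np.random.seed(987467)
x = np.random.multivariate_normal([0, 1.], [[1., 0.5], [0.5, 1.]], size=20)
weights = np.random.randint(1, 4, size=20)
xlong = np.repeat(x, weights, axis=0)

# fweights: row i is counted weights[i] times, as if the data were repeated
cov_w = np.cov(x.T, fweights=weights, bias=1)
cov_long = np.cov(xlong.T, bias=1)
assert np.allclose(cov_w, cov_long)
```

With `bias=1` this matches the ddof=0 result above; `bias=0` gives the ddof=1 normalization.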

Editorial note

The initial answer pointed out a bug in statsmodels, which has since been fixed.
