Class: Statsample::Bivariate::Polychoric
- Inherits: Object
- Includes: DirtyMemoize, Summarizable
- Defined in: lib/statsample/bivariate/polychoric.rb,
  lib/statsample/bivariate/polychoric/processor.rb
Defined Under Namespace
Classes: Processor
Constant Summary
- METHOD = :two_step
  Default method.
- MAX_ITERATIONS = 300
  Maximum number of iterations.
- EPSILON = 1e-6
  Epsilon (absolute error).
- MINIMIZER_TYPE_TWO_STEP = "brent"
  GSL unidimensional minimizer.
- MINIMIZER_TYPE_JOINT_DERIVATIVE = "conjugate_pr"
  GSL multidimensional minimizer, derivative based.
- MINIMIZER_TYPE_JOINT_NO_DERIVATIVE = "nmsimplex"
  GSL multidimensional minimizer, non-derivative based.
Instance Attribute Summary
- #alpha ⇒ Object (readonly; also: #threshold_x)
  Returns the row thresholds.
- #beta ⇒ Object (readonly; also: #threshold_y)
  Returns the column thresholds.
- #debug ⇒ Object
  Debug algorithm (see iterations, for example).
- #epsilon ⇒ Object
  Absolute error for iteration.
- #failed ⇒ Object (readonly)
  Returns the value of attribute failed.
- #iteration ⇒ Object (readonly)
  Number of iterations.
- #log ⇒ Object (readonly)
  Log of the algorithm.
- #loglike_model ⇒ Object (readonly)
  Model log likelihood.
- #max_iterations ⇒ Object
  Maximum number of iterations used in iterative methods.
- #method ⇒ Object
  Method of calculation of the polychoric correlation.
- #minimizer_type_joint_derivative ⇒ Object
  Minimizer type for the joint estimate, using derivatives.
- #minimizer_type_joint_no_derivative ⇒ Object
  Minimizer type for the joint estimate, without derivatives.
- #minimizer_type_two_step ⇒ Object
  Minimizer type for the two-step method.
- #name ⇒ Object
  Name of the analysis.
- #r ⇒ Object (readonly)
  Returns the polychoric correlation.
Class Method Summary
- .new_with_vectors(v1, v2) ⇒ Object
  Create a Polychoric object, based on two vectors.
Instance Method Summary
- #chi_square ⇒ Object
  Chi-square of the model.
- #chi_square_df ⇒ Object
- #compute ⇒ Object
  Start the computation of the polychoric correlation, based on the attribute method.
- #compute_basic_parameters ⇒ Object
- #compute_derivatives_vector(v, df) ⇒ Object
  :nodoc:
- #compute_one_step_mle ⇒ Object
  Compute joint ML estimation.
- #compute_one_step_mle_with_derivatives ⇒ Object
  Compute the polychoric correlation with a joint estimate, using a derivative-based minimization method.
- #compute_one_step_mle_without_derivatives ⇒ Object
  Compute the polychoric correlation with a joint estimate, using a derivative-free minimization method.
- #compute_polychoric_series ⇒ Object
  Compute the polychoric correlation using polychoric series.
- #compute_two_step_mle ⇒ Object
  Computation of the polychoric correlation using two-step ML estimation.
- #compute_two_step_mle_gsl ⇒ Object
  Compute two-step ML estimation using GSL.
- #compute_two_step_mle_ruby ⇒ Object
  Compute two-step ML estimation using only Ruby.
- #expected ⇒ Object
  :nodoc:
- #hermit(s, k) ⇒ Object
  Computes vector h(mm7) of orthogonal Hermite…
- #initialize(matrix, opts = Hash.new) ⇒ Polychoric (constructor)
  Params: matrix: contingency table; opts: Hash with options.
- #loglike_data ⇒ Object
  :section: LL methods. Retrieve log likelihood for the actual data.
- #matrix_for_rho(rho) ⇒ Object
  :nodoc:
- #report_building(generator) ⇒ Object
  :nodoc:
- #xnorm(t) ⇒ Object
  :nodoc:
Constructor Details
#initialize(matrix, opts = Hash.new) ⇒ Polychoric
Params:
- matrix: Contingency table
- opts: Hash with options; could be any accessible attribute of the object
# File 'lib/statsample/bivariate/polychoric.rb', line 168

def initialize(matrix, opts=Hash.new)
  @matrix=matrix
  @n=matrix.column_size
  @m=matrix.row_size
  raise "row size <1" if @m<=1
  raise "column size <1" if @n<=1
  @method=METHOD
  @name=_("Polychoric correlation")
  @max_iterations=MAX_ITERATIONS
  @epsilon=EPSILON
  @minimizer_type_two_step=MINIMIZER_TYPE_TWO_STEP
  @minimizer_type_joint_no_derivative=MINIMIZER_TYPE_JOINT_NO_DERIVATIVE
  @minimizer_type_joint_derivative=MINIMIZER_TYPE_JOINT_DERIVATIVE
  @debug=false
  @iteration=nil
  opts.each{|k,v|
    self.send("#{k}=",v) if self.respond_to? k
  }
  @r=nil
  @pd=nil
  @failed=false
  compute_basic_parameters
end
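A minimal usage sketch (assuming Statsample and its Distribution dependency are installed; the contingency table counts below are made up for illustration, and the option keys map to the writable attributes listed above):

  require 'statsample'

  # 3x3 contingency table of two ordinal variables (illustrative counts)
  ct = Matrix[[58, 52, 1],
              [26, 58, 3],
              [ 8, 12, 9]]

  poly = Statsample::Bivariate::Polychoric.new(ct, :method => :two_step)
  poly.compute
  puts poly.r            # polychoric correlation
  p    poly.threshold_x  # row thresholds (alpha)
  p    poly.threshold_y  # column thresholds (beta)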
Instance Attribute Details
#alpha ⇒ Object (readonly) Also known as: threshold_x
Returns the row thresholds.
# File 'lib/statsample/bivariate/polychoric.rb', line 139

def alpha
  @alpha
end
#beta ⇒ Object (readonly) Also known as: threshold_y
Returns the column thresholds.
# File 'lib/statsample/bivariate/polychoric.rb', line 141

def beta
  @beta
end
#debug ⇒ Object
Debug algorithm (See iterations, for example)
# File 'lib/statsample/bivariate/polychoric.rb', line 102

def debug
  @debug
end
#epsilon ⇒ Object
Absolute error for iteration.
# File 'lib/statsample/bivariate/polychoric.rb', line 125

def epsilon
  @epsilon
end
#failed ⇒ Object (readonly)
Returns the value of attribute failed.
# File 'lib/statsample/bivariate/polychoric.rb', line 143

def failed
  @failed
end
#iteration ⇒ Object (readonly)
Number of iterations
# File 'lib/statsample/bivariate/polychoric.rb', line 128

def iteration
  @iteration
end
#log ⇒ Object (readonly)
Log of algorithm
# File 'lib/statsample/bivariate/polychoric.rb', line 131

def log
  @log
end
#loglike_model ⇒ Object (readonly)
Model log likelihood.
# File 'lib/statsample/bivariate/polychoric.rb', line 134

def loglike_model
  @loglike_model
end
#max_iterations ⇒ Object
Maximum number of iterations used in iterative methods. Defaults to MAX_ITERATIONS.
# File 'lib/statsample/bivariate/polychoric.rb', line 100

def max_iterations
  @max_iterations
end
#method ⇒ Object
Method of calculation of the polychoric correlation. :two_step is used by default.
- :two_step: two-step ML, based on code by J. Fox.
- :polychoric_series: polychoric series estimate, using algorithm AS87 by Martinson and Hamdan (1975).
- :joint: one-step ML, using derivatives, by Olsson (1979).
# File 'lib/statsample/bivariate/polychoric.rb', line 123

def method
  @method
end
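For example, switching to the one-step (joint) ML estimate before computing (a sketch; ct is a contingency Matrix as in the constructor example above):

  poly = Statsample::Bivariate::Polychoric.new(ct)
  poly.method = :joint   # default is :two_step; :polychoric_series is also accepted
  poly.compute
  puts poly.r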
#minimizer_type_joint_derivative ⇒ Object
Minimizer type for the joint estimate, using derivatives. Default: “conjugate_pr”. See rb-gsl.rubyforge.org/min.html for reference.
# File 'lib/statsample/bivariate/polychoric.rb', line 113

def minimizer_type_joint_derivative
  @minimizer_type_joint_derivative
end
#minimizer_type_joint_no_derivative ⇒ Object
Minimizer type for the joint estimate, without derivatives. Default: “nmsimplex”. See rb-gsl.rubyforge.org/min.html for reference.
# File 'lib/statsample/bivariate/polychoric.rb', line 109

def minimizer_type_joint_no_derivative
  @minimizer_type_joint_no_derivative
end
#minimizer_type_two_step ⇒ Object
Minimizer type for the two-step method. Default: “brent”. See rb-gsl.rubyforge.org/min.html for reference.
# File 'lib/statsample/bivariate/polychoric.rb', line 105

def minimizer_type_two_step
  @minimizer_type_two_step
end
#name ⇒ Object
Name of the analysis
# File 'lib/statsample/bivariate/polychoric.rb', line 98

def name
  @name
end
#r ⇒ Object (readonly)
Returns the polychoric correlation
# File 'lib/statsample/bivariate/polychoric.rb', line 137

def r
  @r
end
Class Method Details
.new_with_vectors(v1, v2) ⇒ Object
Create a Polychoric object, based on two vectors
# File 'lib/statsample/bivariate/polychoric.rb', line 161

def self.new_with_vectors(v1,v2)
  Polychoric.new(Crosstab.new(v1,v2).to_matrix)
end
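A sketch of building the object from two ordinal vectors instead of a precomputed table (the vector class depends on the Statsample version in use; recent versions take Daru::Vector, and the category codes below are illustrative):

  require 'statsample'

  x = Daru::Vector.new([1, 1, 2, 2, 3, 3, 1, 2, 3, 3])
  y = Daru::Vector.new([1, 2, 2, 3, 3, 3, 1, 1, 2, 3])

  poly = Statsample::Bivariate::Polychoric.new_with_vectors(x, y)
  poly.compute
  puts poly.r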
Instance Method Details
#chi_square ⇒ Object
Chi-square of the model.
# File 'lib/statsample/bivariate/polychoric.rb', line 230

def chi_square
  if @loglike_model.nil?
    compute
  end
  -2*(@loglike_model-loglike_data)
end
#chi_square_df ⇒ Object
# File 'lib/statsample/bivariate/polychoric.rb', line 237

def chi_square_df
  (@nr*@nc)-@nc-@nr
end
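Together with #chi_square, this gives the likelihood-ratio test of bivariate normality reported by #report_building; a sketch that mirrors that code (poly is a computed Polychoric instance):

  chi = poly.chi_square
  df  = poly.chi_square_df
  p_value = 1 - Distribution::ChiSquare.cdf(chi, df)
  puts "X^2 = %0.3f, df = %d, p = %0.5f" % [chi, df, p_value]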
#compute ⇒ Object
Start the computation of the polychoric correlation, based on the attribute method.
# File 'lib/statsample/bivariate/polychoric.rb', line 201

def compute
  if @method==:two_step
    compute_two_step_mle
  elsif @method==:joint
    compute_one_step_mle
  elsif @method==:polychoric_series
    compute_polychoric_series
  else
    raise "Not implemented"
  end
end
#compute_basic_parameters ⇒ Object
# File 'lib/statsample/bivariate/polychoric.rb', line 244

def compute_basic_parameters
  @nr=@matrix.row_size
  @nc=@matrix.column_size
  @sumr=[0]*@matrix.row_size
  @sumrac=[0]*@matrix.row_size
  @sumc=[0]*@matrix.column_size
  @sumcac=[0]*@matrix.column_size
  @alpha=[0]*(@nr-1)
  @beta=[0]*(@nc-1)
  @total=0
  @nr.times do |i|
    @nc.times do |j|
      @sumr[i]+=@matrix[i,j]
      @sumc[j]+=@matrix[i,j]
      @total+=@matrix[i,j]
    end
  end
  ac=0
  (@nr-1).times do |i|
    @sumrac[i]=@sumr[i]+ac
    @alpha[i]=Distribution::Normal.p_value(@sumrac[i] / @total.to_f)
    ac=@sumrac[i]
  end
  ac=0
  (@nc-1).times do |i|
    @sumcac[i]=@sumc[i]+ac
    @beta[i]=Distribution::Normal.p_value(@sumcac[i] / @total.to_f)
    ac=@sumcac[i]
  end
end
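The thresholds produced here are the inverse standard-normal CDF evaluated at the cumulative marginal proportions; a standalone sketch of the same calculation (the marginal counts are illustrative):

  row_totals = [111, 87, 29]                   # illustrative row marginals
  total      = row_totals.inject(:+).to_f
  cum        = 0
  alpha = row_totals[0..-2].map do |n|
    cum += n
    Distribution::Normal.p_value(cum / total)  # inverse CDF, as in the code above
  end
  # alpha now holds (number of rows - 1) thresholds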
#compute_derivatives_vector(v, df) ⇒ Object
:nodoc:
# File 'lib/statsample/bivariate/polychoric.rb', line 362

def compute_derivatives_vector(v,df) # :nodoc:
  new_rho=v[0]
  new_alpha=v[1, @nr-1]
  new_beta=v[@nr, @nc-1]
  if new_rho.abs>0.9999
    new_rho= (new_rho>0) ? 0.9999 : -0.9999
  end
  pr=Processor.new(new_alpha,new_beta,new_rho,@matrix)
  df[0]=-pr.fd_loglike_rho
  new_alpha.to_a.each_with_index {|v,i|
    df[i+1]=-pr.fd_loglike_a(i)
  }
  offset=new_alpha.size+1
  new_beta.to_a.each_with_index {|v,i|
    df[offset+i]=-pr.fd_loglike_b(i)
  }
end
#compute_one_step_mle ⇒ Object
Compute joint ML estimation. Uses compute_one_step_mle_with_derivatives() by default.
# File 'lib/statsample/bivariate/polychoric.rb', line 382

def compute_one_step_mle
  compute_one_step_mle_with_derivatives
end
#compute_one_step_mle_with_derivatives ⇒ Object
Compute the polychoric correlation with a joint estimate, using a derivative-based minimization method.
Much faster than the method without derivatives.
# File 'lib/statsample/bivariate/polychoric.rb', line 390

def compute_one_step_mle_with_derivatives
  # Get initial values with two-step approach
  compute_two_step_mle
  # Start iteration with past values
  rho=@r
  cut_alpha=@alpha
  cut_beta=@beta
  parameters=[rho]+cut_alpha+cut_beta
  np=@nc-1+@nr

  loglike_f = Proc.new { |v, params|
    new_rho=v[0]
    new_alpha=v[1, @nr-1]
    new_beta=v[@nr, @nc-1]
    pr=Processor.new(new_alpha,new_beta,new_rho,@matrix)
    pr.loglike
  }

  loglike_df = Proc.new {|v, params, df|
    compute_derivatives_vector(v,df)
  }

  my_func = GSL::MultiMin::Function_fdf.alloc(loglike_f,loglike_df, np)
  my_func.set_params(parameters)      # parameters

  x = GSL::Vector.alloc(parameters.dup)

  minimizer = GSL::MultiMin::FdfMinimizer.alloc(minimizer_type_joint_derivative,np)
  minimizer.set(my_func, x, 1, 1e-3)

  iter = 0
  message=""
  begin_time=Time.new
  begin
    iter += 1
    status = minimizer.iterate()
    #p minimizer.f
    #p minimizer.gradient
    status = minimizer.test_gradient(1e-3)
    if status == GSL::SUCCESS
      total_time=Time.new-begin_time
      message+="Joint MLE converged to minimum on %0.3f seconds at\n" % total_time
    end
    x = minimizer.x
    message+= sprintf("%5d iterations", iter)+"\n"
    message+= "args="
    for i in 0...np do
      message+=sprintf("%10.3e ", x[i])
    end
    message+=sprintf("f() = %7.3f\n" , minimizer.f)+"\n"
  rescue => e
    @failed=true
  end while status == GSL::CONTINUE and iter < @max_iterations

  @iteration=iter
  @log+=message
  @r=minimizer.x[0]
  @alpha=minimizer.x[1,@nr-1].to_a
  @beta=minimizer.x[@nr,@nc-1].to_a
  @loglike_model= -minimizer.minimum
  pr=Processor.new(@alpha,@beta,@r,@matrix)
end
#compute_one_step_mle_without_derivatives ⇒ Object
Compute the polychoric correlation with a joint estimate, using a derivative-free minimization method.
Rho and thresholds are estimated at the same time. Code based on the R package “polycor”, by J. Fox.
# File 'lib/statsample/bivariate/polychoric.rb', line 463

def compute_one_step_mle_without_derivatives
  # Get initial values with two-step approach
  compute_two_step_mle
  # Start iteration with past values
  rho=@r
  cut_alpha=@alpha
  cut_beta=@beta
  parameters=[rho]+cut_alpha+cut_beta
  np=@nc-1+@nr
  minimization = Proc.new { |v, params|
    new_rho=v[0]
    new_alpha=v[1, @nr-1]
    new_beta=v[@nr, @nc-1]

    #puts "f'rho=#{fd_loglike_rho(alpha,beta,rho)}"
    #(@nr-1).times {|k|
    #  puts "f'a(#{k}) = #{fd_loglike_a(alpha,beta,rho,k)}"
    #  puts "f'a(#{k}) v2 = #{fd_loglike_a2(alpha,beta,rho,k)}"
    #}
    #(@nc-1).times {|k|
    #  puts "f'b(#{k}) = #{fd_loglike_b(alpha,beta,rho,k)}"
    #}
    pr=Processor.new(new_alpha,new_beta,new_rho,@matrix)
    df=Array.new(np)
    #compute_derivatives_vector(v,df)
    pr.loglike
  }
  my_func = GSL::MultiMin::Function.alloc(minimization, np)
  my_func.set_params(parameters)      # parameters

  x = GSL::Vector.alloc(parameters.dup)
  ss = GSL::Vector.alloc(np)
  ss.set_all(1.0)

  minimizer = GSL::MultiMin::FMinimizer.alloc(minimizer_type_joint_no_derivative,np)
  minimizer.set(my_func, x, ss)

  iter = 0
  message=""
  begin_time=Time.new
  begin
    iter += 1
    status = minimizer.iterate()
    status = minimizer.test_size(@epsilon)
    if status == GSL::SUCCESS
      total_time=Time.new-begin_time
      message="Joint MLE converged to minimum on %0.3f seconds at\n" % total_time
    end
    x = minimizer.x
    message+= sprintf("%5d iterations", iter)+"\n"
    for i in 0...np do
      message+=sprintf("%10.3e ", x[i])
    end
    message+=sprintf("f() = %7.3f size = %.3f\n", minimizer.fval, minimizer.size)+"\n"
  end while status == GSL::CONTINUE and iter < @max_iterations
  @iteration=iter
  @log+=message
  @r=minimizer.x[0]
  @alpha=minimizer.x[1,@nr-1].to_a
  @beta=minimizer.x[@nr,@nc-1].to_a
  @loglike_model= -minimizer.minimum
end
#compute_polychoric_series ⇒ Object
Compute the polychoric correlation using polychoric series. Algorithm: AS87, by Martinson and Hamdan (1975).
Warning: according to Drasgow (2006), this computation diverges greatly from the joint and two-step methods.
# File 'lib/statsample/bivariate/polychoric.rb', line 577

def compute_polychoric_series
  @nn=@n-1
  @mm=@m-1
  @nn7=7*@nn
  @mm7=7*@mm
  @mn=@n*@m
  @cont=[nil]
  @n.times {|j|
    @m.times {|i|
      @cont.push(@matrix[i,j])
    }
  }

  pcorl=0
  cont=@cont
  xmean=0.0
  sum=0.0
  row=[]
  colmn=[]
  (1..@m).each do |i|
    row[i]=0.0
    l=i
    (1..@n).each do |j|
      row[i]=row[i]+cont[l]
      l+=@m
    end
    raise "Should not be empty rows" if(row[i]==0.0)
    xmean=xmean+row[i]*i.to_f
    sum+=row[i]
  end
  xmean=xmean/sum.to_f

  ymean=0.0
  (1..@n).each do |j|
    colmn[j]=0.0
    l=(j-1)*@m
    (1..@m).each do |i|
      l=l+1
      colmn[j]=colmn[j]+cont[l] #12
    end
    raise "Should not be empty cols" if colmn[j]==0
    ymean=ymean+colmn[j]*j.to_f
  end
  ymean=ymean/sum.to_f

  covxy=0.0
  (1..@m).each do |i|
    l=i
    (1..@n).each do |j|
      covxy=covxy+cont[l]*(i.to_f-xmean)*(j.to_f-ymean)
      l=l+@m
    end
  end

  chisq=0.0
  (1..@m).each do |i|
    l=i
    (1..@n).each do |j|
      chisq=chisq+((cont[l]**2).quo(row[i]*colmn[j]))
      l=l+@m
    end
  end

  phisq=chisq-1.0-(@mm*@nn).to_f / sum.to_f
  phisq=0 if(phisq<0.0)

  # Compute cumulative sum of columns and rows
  sumc=[]
  sumr=[]
  sumc[1]=colmn[1]
  sumr[1]=row[1]
  cum=0
  (1..@nn).each do |i| # goto 17 r20
    cum=cum+colmn[i]
    sumc[i]=cum
  end
  cum=0
  (1..@mm).each do |i|
    cum=cum+row[i]
    sumr[i]=cum
  end

  alpha=[]
  beta=[]
  # Compute points of polytomy
  (1..@mm).each do |i| #do 21
    alpha[i]=Distribution::Normal.p_value(sumr[i] / sum.to_f)
  end # 21
  (1..@nn).each do |i| #do 22
    beta[i]=Distribution::Normal.p_value(sumc[i] / sum.to_f)
  end # 21

  @alpha=alpha[1,alpha.size]
  @beta=beta[1,beta.size]
  @sumr=row[1,row.size]
  @sumc=colmn[1,colmn.size]
  @total=sum

  # Compute Fourier coefficients a and b. Verified
  h=hermit(alpha,@mm)
  hh=hermit(beta,@nn)

  a=[]
  b=[]

  if @m!=2 # goto 24
    mmm=@m-2
    (1..mmm).each do |i| #do 23
      a1=sum.quo(row[i+1] * sumr[i] * sumr[i+1])
      a2=sumr[i] * xnorm(alpha[i+1])
      a3=sumr[i+1] * xnorm(alpha[i])
      l=i
      (1..7).each do |j| #do 23
        a[l]=Math::sqrt(a1.quo(j))*(h[l+1] * a2 - h[l] * a3)
        l=l+@mm
      end
    end #23
  end # 24

  if @n!=2 # goto 26
    nnn=@n-2
    (1..nnn).each do |i| #do 25
      a1=sum.quo(colmn[i+1] * sumc[i] * sumc[i+1])
      a2=sumc[i] * xnorm(beta[i+1])
      a3=sumc[i+1] * xnorm(beta[i])
      l=i
      (1..7).each do |j| #do 25
        b[l]=Math::sqrt(a1.quo(j))*(a2 * hh[l+1] - a3*hh[l])
        l=l+@nn
      end # 25
    end # 25
  end #26 r20

  l = @mm
  a1 = -sum * xnorm(alpha[@mm])
  a2 = row[@m] * sumr[@mm]
  (1..7).each do |j| # do 27
    a[l]=a1 * h[l].quo(Math::sqrt(j*a2))
    l=l+@mm
  end # 27

  l = @nn
  a1 = -sum * xnorm(beta[@nn])
  a2 = colmn[@n] * sumc[@nn]
  (1..7).each do |j| # do 28
    b[l]=a1 * hh[l].quo(Math::sqrt(j*a2))
    l = l + @nn
  end # 28

  rcof=[]
  # compute coefficients rcof of polynomial of order 8
  rcof[1]=-phisq
  (2..9).each do |i| # do 30
    rcof[i]=0.0
  end #30

  m1=@mm
  (1..@mm).each do |i| # do 31
    m1=m1+1
    m2=m1+@mm
    m3=m2+@mm
    m4=m3+@mm
    m5=m4+@mm
    m6=m5+@mm
    n1=@nn
    (1..@nn).each do |j| # do 31
      n1=n1+1
      n2=n1+@nn
      n3=n2+@nn
      n4=n3+@nn
      n5=n4+@nn
      n6=n5+@nn

      rcof[3] = rcof[3] + a[i]**2 * b[j]**2
      rcof[4] = rcof[4] + 2.0 * a[i] * a[m1] * b[j] * b[n1]
      rcof[5] = rcof[5] + a[m1]**2 * b[n1]**2 + 2.0 * a[i] * a[m2] * b[j] * b[n2]
      rcof[6] = rcof[6] + 2.0 * (a[i] * a[m3] * b[j] * b[n3] + a[m1] * a[m2] * b[n1] * b[n2])
      rcof[7] = rcof[7] + a[m2]**2 * b[n2]**2 + 2.0 * (a[i] * a[m4] * b[j] * b[n4] + a[m1] * a[m3] * b[n1] * b[n3])
      rcof[8] = rcof[8] + 2.0 * (a[i] * a[m5] * b[j] * b[n5] + a[m1] * a[m4] * b[n1] * b[n4] + a[m2] * a[m3] * b[n2] * b[n3])
      rcof[9] = rcof[9] + a[m3]**2 * b[n3]**2 + 2.0 * (a[i] * a[m6] * b[j] * b[n6] + a[m1] * a[m5] * b[n1] * b[n5] + (a[m2] * a[m4] * b[n2] * b[n4]))
    end # 31
  end # 31

  rcof=rcof[1,rcof.size]
  poly = GSL::Poly.alloc(rcof)
  roots=poly.solve
  rootr=[nil]
  rooti=[nil]
  roots.each {|c|
    rootr.push(c.real)
    rooti.push(c.im)
  }
  @rootr=rootr
  @rooti=rooti

  norts=0
  (1..7).each do |i| # do 43
    next if rooti[i]!=0.0
    if (covxy>=0.0)
      next if(rootr[i]<0.0 or rootr[i]>1.0)
      pcorl=rootr[i]
      norts=norts+1
    else
      if (rootr[i]>=-1.0 and rootr[i]<0.0)
        pcorl=rootr[i]
        norts=norts+1
      end
    end
  end # 43
  raise "Error" if norts==0
  @r=pcorl
  pr=Processor.new(@alpha,@beta,@r,@matrix)
  @loglike_model=-pr.loglike
end
#compute_two_step_mle ⇒ Object
Computation of the polychoric correlation using two-step ML estimation.
Two-step ML estimation “first estimates the thresholds from the one-way marginal frequencies, then estimates rho, conditional on these thresholds, via maximum likelihood” (Uebersax, 2006).
The algorithm is based on code by Gegenfurtner (1992).
References:
-
Gegenfurtner, K. (1992). PRAXIS: Brent’s algorithm for function minimization. Behavior Research Methods, Instruments & Computers, 24(4), 560-564. Available on www.allpsych.uni-giessen.de/karl/pdf/03.praxis.pdf
-
Uebersax, J.S. (2006). The tetrachoric and polychoric correlation coefficients. Statistical Methods for Rater Agreement web site. Available at: john-uebersax.com/stat/tetra.htm. Accessed February 11, 2010.
# File 'lib/statsample/bivariate/polychoric.rb', line 287

def compute_two_step_mle
  if Statsample.has_gsl?
    compute_two_step_mle_gsl
  else
    compute_two_step_mle_ruby
  end
end
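The two-step path uses the GSL Brent minimizer when the rb-gsl bindings are available and otherwise falls back to the pure-Ruby Minimization::Brent implementation; a sketch for checking which path will be taken (poly as above):

  if Statsample.has_gsl?
    puts "two-step estimation will use GSL (#{poly.minimizer_type_two_step})"
  else
    puts "two-step estimation will use the pure-Ruby Minimization::Brent fallback"
  end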
#compute_two_step_mle_gsl ⇒ Object
Compute two-step ML estimation using GSL.
# File 'lib/statsample/bivariate/polychoric.rb', line 319

def compute_two_step_mle_gsl
  fn1=GSL::Function.alloc {|rho|
    pr=Processor.new(@alpha,@beta, rho, @matrix)
    pr.loglike
  }
  @iteration = 0
  max_iter = @max_iterations
  m = 0             # initial guess
  m_expected = 0
  a=-0.9999
  b=+0.9999
  gmf = GSL::Min::FMinimizer.alloc(@minimizer_type_two_step)
  gmf.set(fn1, m, a, b)
  header=_("Two step minimization using %s method\n") % gmf.name
  header+=sprintf("%5s [%9s, %9s] %9s %10s %9s\n", "iter", "lower", "upper", "min", "err", "err(est)")
  header+=sprintf("%5d [%.7f, %.7f] %.7f %+.7f %.7f\n", @iteration, a, b, m, m - m_expected, b - a)
  @log=header
  puts header if @debug
  begin
    @iteration += 1
    status = gmf.iterate
    status = gmf.test_interval(@epsilon, 0.0)
    if status == GSL::SUCCESS
      @log+="converged:"
      puts "converged:" if @debug
    end
    a = gmf.x_lower
    b = gmf.x_upper
    m = gmf.x_minimum
    message=sprintf("%5d [%.7f, %.7f] %.7f %+.7f %.7f\n", @iteration, a, b, m, m - m_expected, b - a)
    @log+=message
    puts message if @debug
  end while status == GSL::CONTINUE and @iteration < @max_iterations
  @r=gmf.x_minimum
  @loglike_model=-gmf.f_minimum
end
#compute_two_step_mle_ruby ⇒ Object
Compute two-step ML estimation using only Ruby.
# File 'lib/statsample/bivariate/polychoric.rb', line 297

def compute_two_step_mle_ruby #:nodoc:
  f=proc {|rho|
    pr=Processor.new(@alpha,@beta, rho, @matrix)
    pr.loglike
  }
  @log=_("Two step minimization using GSL Brent method (pure ruby)\n")
  min=Minimization::Brent.new(-0.9999,0.9999,f)
  min.epsilon=@epsilon
  min.expected=0
  min.iterate
  @log+=min.log.to_table.to_s
  @r=min.x_minimum
  @loglike_model=-min.f_minimum
  @iteration=min.iterations
  puts @log if @debug
end
#expected ⇒ Object
:nodoc:
# File 'lib/statsample/bivariate/polychoric.rb', line 546

def expected # :nodoc:
  rt=[]
  ct=[]
  t=0
  @matrix.row_size.times {|i|
    @matrix.column_size.times {|j|
      rt[i]=0 if rt[i].nil?
      ct[j]=0 if ct[j].nil?
      rt[i]+=@matrix[i,j]
      ct[j]+=@matrix[i,j]
      t+=@matrix[i,j]
    }
  }
  m=[]
  @matrix.row_size.times {|i|
    row=[]
    @matrix.column_size.times {|j|
      row[j]=(rt[i]*ct[j]).quo(t)
    }
    m.push(row)
  }
  Matrix.rows(m)
end
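What #expected returns is the usual independence-model table: row total times column total over the grand total for each cell. A sketch with illustrative counts:

  ct   = Matrix[[10, 20],
                [30, 40]]
  poly = Statsample::Bivariate::Polychoric.new(ct)
  poly.expected
  # => the 2x2 matrix [[12, 18], [28, 42]] (entries come back as Rationals, since .quo is used)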
#hermit(s, k) ⇒ Object
Computes vector h(mm7) of orthogonal Hermite…
# File 'lib/statsample/bivariate/polychoric.rb', line 801

def hermit(s,k) # :nodoc:
  h=[]
  (1..k).each do |i| # do 14
    l=i
    ll=i+k
    lll=ll+k
    h[i]=1.0
    h[ll]=s[i]
    v=1.0
    (2..6).each do |j| #do 14
      w=Math::sqrt(j)
      h[lll]=(s[i]*h[ll] - v*h[l]).quo(w)
      v=w
      l=l+k
      ll=ll+k
      lll=lll+k
    end
  end
  h
end
#loglike_data ⇒ Object
:section: LL methods. Retrieve log likelihood for the actual data.
# File 'lib/statsample/bivariate/polychoric.rb', line 215

def loglike_data
  loglike=0
  @nr.times do |i|
    @nc.times do |j|
      res=@matrix[i,j].quo(@total)
      if (res==0)
        res=1e-16
      end
      loglike+= @matrix[i,j] * Math::log(res)
    end
  end
  loglike
end
#matrix_for_rho(rho) ⇒ Object
:nodoc:
# File 'lib/statsample/bivariate/polychoric.rb', line 530

def matrix_for_rho(rho) # :nodoc:
  pd=@nr.times.collect{ [0]*@nc }
  pc=@nr.times.collect{ [0]*@nc }
  @nr.times { |i|
    @nc.times { |j|
      pd[i][j]=Distribution::BivariateNormal.cdf(@alpha[i], @beta[j], rho)
      pc[i][j] = pd[i][j]
      pd[i][j] = pd[i][j] - pc[i-1][j] if i>0
      pd[i][j] = pd[i][j] - pc[i][j-1] if j>0
      pd[i][j] = pd[i][j] + pc[i-1][j-1] if (i>0 and j>0)
      res= pd[i][j]
    }
  }
  Matrix.rows(pc)
end
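Each cell probability is obtained by differencing the cumulative bivariate-normal probabilities at the surrounding thresholds. A sketch of that identity for a single interior cell (the thresholds and rho are illustrative):

  a0, a1 = -0.5, 0.7      # two consecutive row thresholds
  b0, b1 = -0.3, 0.9      # two consecutive column thresholds
  rho    = 0.4
  cdf = ->(x, y) { Distribution::BivariateNormal.cdf(x, y, rho) }
  # P(a0 < X <= a1, b0 < Y <= b1)
  p_cell = cdf.(a1, b1) - cdf.(a0, b1) - cdf.(a1, b0) + cdf.(a0, b0)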
#report_building(generator) ⇒ Object
:nodoc:
# File 'lib/statsample/bivariate/polychoric.rb', line 825

def report_building(generator) # :nodoc:
  compute if dirty?
  section=ReportBuilder::Section.new(:name=>@name)
  if @failed
    section.add("Failed to converge")
  else
    t=ReportBuilder::Table.new(:name=>_("Contingence Table"), :header=>[""]+(@n.times.collect {|i| "Y=#{i}"})+["Total"])
    @m.times do |i|
      t.row(["X = #{i}"]+(@n.times.collect {|j| @matrix[i,j]}) + [@sumr[i]])
    end
    t.hr
    t.row(["T"]+(@n.times.collect {|j| @sumc[j]})+[@total])
    section.add(t)
    section.add(sprintf("r: %0.4f",r))
    t=ReportBuilder::Table.new(:name=>_("Thresholds"), :header=>["","Value"])
    threshold_x.each_with_index {|val,i|
      t.row([_("Threshold X %d") % i, sprintf("%0.4f", val)])
    }
    threshold_y.each_with_index {|val,i|
      t.row([_("Threshold Y %d") % i, sprintf("%0.4f", val)])
    }
    section.add(t)
    section.add(_("Iterations: %d") % @iteration) if @iteration
    section.add(_("Test of bivariate normality: X^2 = %0.3f, df = %d, p= %0.5f" % [ chi_square, chi_square_df, 1-Distribution::ChiSquare.cdf(chi_square, chi_square_df)]))
    generator.parse_element(section)
  end
end
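Since the class includes Summarizable, the report is normally produced through ReportBuilder rather than by calling #report_building directly; a sketch following the usual Statsample/ReportBuilder conventions (assumed here, not shown in this file):

  rb = ReportBuilder.new(:name => "Polychoric analysis")
  rb.add(poly)      # the generator ends up calling poly.report_building
  puts rb.to_text   # or poly.summary for the same text report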
#xnorm(t) ⇒ Object
:nodoc:
# File 'lib/statsample/bivariate/polychoric.rb', line 821

def xnorm(t) # :nodoc:
  Math::exp(-0.5 * t**2) * (1.0/Math::sqrt(2*Math::PI))
end
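xnorm(t) is simply the standard normal density φ(t); a quick check against the distribution gem (assuming Distribution::Normal.pdf is available in the installed version):

  poly.xnorm(0.0)                # => 0.3989... (1/sqrt(2*pi))
  Distribution::Normal.pdf(0.0)  # => same value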