

Aruvi (2017) — review

Movie: Aruvi

Brilliance of screenplay irony:

  • — The scene at the film set after the shooting, where she confronts that cheating,
    exploitative guy. The irony arises from the actress’s numbness and indifference, and
    the way she makes puppets out of the shooting crew, who for a change have to face
    real life-or-death drama rather than the made-up things they shoot. Not to mention
    the background bass.

  • — The idealist wannabe director’s impractical dream of a story he wants to direct.

  • — The role reversal of the prima donna actress serving tea.

  • — Of course, the mock TV show featuring the actress.

Some things about the movie that left me cold, though:
  • — The manner in which the protagonist contracted HIV is a rather low-probability,
    rare route (it had to go from the live blood of the coconut seller to bleeding gums
    in the protagonist’s mouth, or other bleeding wounds in the digestive tract), and
    the director had to reach for it to avoid the taboo around pre-marital sex. I
    personally think that is a meaningless taboo to stick to. (Not that I suggest we let
    US-style marketing and companies use dating and sex as a lure. Ironically, sticking
    to cultural excuses would drive some youth toward exactly that approach.)

“I’m not giving up!”  I raised my voice, angry, surprised at myself for being angry.  I took a breath, forced myself to return to a normal volume, “I’m saying there’s probably no fucking way I’ll understand why she did what she did.  So why waste my time and energy dwelling on it?  Fuck her, she doesn’t deserve the amount of attention I’ve been paying her. I’m… reprioritizing.”

“She’s a bully,” I said.  “At the end of the day, she only wants to fight opponents she knows she can beat.”

“I’ve fought two Endbringers,” Shadow Stalker said, stabbing a finger in my direction.  “I know what you’re trying to do.  Fucking manipulating me, getting me into a dangerous situation where you’ll get me killed.  Fuck you.”

Gaussian Mixture Models (GMMs)

Gaussian Mixture Models

  • A probabilistic model
  • Assumes all data points are generated from a mixture of a finite number of Gaussian
    distributions
  • The parameters of those Gaussian distributions are unknown
  • A way of generalizing k-means (or k-medoids or k-modes, for that matter) clustering
    to use the covariance structure of the latent Gaussians as well as their
    means/central-tendency measures
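That last point can be sketched with scikit-learn (a minimal sketch on synthetic data; the blobs and variable names below are invented for illustration). A full-covariance `GaussianMixture` estimates a covariance matrix per component, not just a centroid:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Two elongated (non-spherical) Gaussian blobs: k-means would rely on
# distance to centroids alone, while a full-covariance GMM also models
# each blob's shape via its covariance matrix.
rng = np.random.RandomState(0)
blob_a = rng.multivariate_normal([0, 0], [[4.0, 0.0], [0.0, 0.1]], size=200)
blob_b = rng.multivariate_normal([0, 3], [[4.0, 0.0], [0.0, 0.1]], size=200)
X = np.vstack([blob_a, blob_b])

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
labels = gmm.fit_predict(X)

print(gmm.means_.shape)        # (2, 2): one mean per component
print(gmm.covariances_.shape)  # (2, 2, 2): one 2x2 covariance per component
```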

scikit-learn

Pros:

  • Fastest for learning mixture models
  • No bias of the means towards zero, and no bias of cluster sizes towards specific
    structures

Cons:

  • When there are not enough points per mixture component, estimating the covariance
    matrices becomes difficult
  • Number of components: it will always use all the components it has access to, so
    deciding how many to use may require held-out/test data or an information criterion

  • The number of components can be chosen based on the BIC criterion.

  • A variational Bayesian Gaussian mixture avoids having to specify the number of
    components explicitly.
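A sketch of BIC-based component selection (scikit-learn; the three-cluster toy data is invented):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.RandomState(0)
# Toy data drawn from three well-separated Gaussians
X = np.vstack([rng.normal(loc, 0.5, size=(100, 2)) for loc in (0.0, 5.0, 10.0)])

# Fit mixtures with 1..6 components and keep the one with the lowest BIC
bics = [
    GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
    for k in range(1, 7)
]
best_k = int(np.argmin(bics)) + 1
print(best_k)  # the BIC minimum recovers the true component count (3) here
```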

Variational Bayesian Gaussian Mixture

Fitting a Gaussian model to data
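A minimal sketch of fitting a variational Bayesian mixture to data with scikit-learn's `BayesianGaussianMixture` (synthetic two-cluster data; the prior values are illustrative). Given more components than needed, the Dirichlet prior shrinks the weights of surplus components toward zero:

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.RandomState(0)
# Two well-separated clusters, but we offer the model 8 components
X = np.vstack([rng.normal(loc, 0.5, size=(150, 2)) for loc in (0.0, 6.0)])

bgm = BayesianGaussianMixture(
    n_components=8,                   # deliberate over-specification
    weight_concentration_prior=0.01,  # small prior favors fewer active components
    max_iter=500,
    random_state=0,
).fit(X)

# Count components that actually carry weight
effective = int((bgm.weights_ > 0.05).sum())
print(effective)
```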

Worm

A similar problem on a smaller scale.  I can walk through minutes, I could have walked back to save them, but I let them die because it meant a monster would remain gone.  What merit is a gamble, a sacrifice, if you stake things that matter nothing to you?

Worm

“…have been doing this for ten years.  I admire you for retaining your…” he trailed off.

“Idealism?”

“Not a word I’m familiar with, Weaver.  Faith?”

“Faith works.”

“I have none left, after ten years.  No faith.  We are a wretched, petty species, and we have been given power to destroy ourselves with

Factor Analysis — notes

Factor Analysis:

Multiple Classifications:

Aka [Dimensionality reduction](https://en.wikipedia.org/wiki/Dimensionality_reduction)
Aka [Dimensionality Estimation](http://disco.ethz.ch/lectures/fs11/seminar/paper/samuel-1.pdf)
### Methods:
    * [Intrinsic Dimension Estimation](https://www.stat.berkeley.edu/~bickel/mldim.pdf)
      or (http://www.sciencedirect.com/science/article/pii/S0020025515006179)
    * [PCA](https://en.wikipedia.org/wiki/Principal_component_analysis)
    * [Kernel-PCA](http://papers.nips.cc/paper/1491-kernel-pca-and-de-noising-in-feature-spaces.pdf)
    * [Graph-based kernel PCA](http://ieeexplore.ieee.org/abstract/document/1261097/)
    * [Linear Discriminant Analysis](http://www.music.mcgill.ca/~ich/classes/mumt611_07/classifiers/lda_theory.pdf)
    * [Generalized Discriminant Analysis](http://www.jmlr.org/papers/v6/ye05a.html)
    * [Manifold Learning](http://scikit-learn.org/stable/modules/manifold.html)
### Factor Analysis (based on goal):

    *  [Exploratory Factor
       Analysis](https://en.wikipedia.org/wiki/Exploratory_factor_analysis):
        #### Fitting Procedures:
            * used to estimate factor loadings and unique variances


    * [Confirmatory Factor
      Analysis](https://en.wikipedia.org/wiki/Confirmatory_factor_analysis):

### Types of factoring:
    * [Principal Component
      Analysis](https://en.wikipedia.org/wiki/Principal_component_analysis):

    * Canonical Factor Analysis: aka Rao's canonical factoring; uses the principal-axis
      method, is unaffected by arbitrary rescaling, and seeks the highest canonical
      correlation measure.

    * Common Factor Analysis: aka principal factor analysis; seeks the least number of
      factors that account for the common variance of a set of variables.

    * Image Factoring: based on the correlation matrix of predicted variables, where each
      prediction is done via [multiple
      regression](https://en.wikipedia.org/wiki/Multiple_regression)

    * Alpha Factoring: based on maximizing the reliability of factors; assumes random
      sampling of variables from a universe of variables (other methods assume fixed
      variables)

    * Factor Regression Model: a combination of the factor and regression models; aka the
      hybrid factor model, with partially known factors
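As a quick end-to-end illustration (a sketch using scikit-learn's `FactorAnalysis` on synthetic data; the generative setup below is invented), fitting estimates both the factor loadings and the per-variable unique variances:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.RandomState(0)
# 6 observed variables generated from 2 latent factors plus unique noise
F = rng.normal(size=(500, 2))                 # latent factor scores
W = rng.normal(size=(2, 6))                   # true loadings
X = F @ W + 0.3 * rng.normal(size=(500, 6))   # observed data

fa = FactorAnalysis(n_components=2, random_state=0).fit(X)

print(fa.components_.shape)      # (2, 6): estimated factor loadings
print(fa.noise_variance_.shape)  # (6,): estimated unique variances
```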

### Terminology:
    * Factor Loadings:
    * Interpreting Factor loadings:
    * Communality:
    * Spurious Solutions:
    * Uniqueness of Variable:
    * EigenValues/Characteristic Roots:
    * Extraction Sums of squared loadings:
    * Factor Scores:

### Criteria for number of Factors:
    * Horn's Parallel Analysis:
    * Velicer's MAP test:
    Older methods:
    * Kaiser Criterion:
    * Scree plot:
    * Variance explained criteria:
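The Kaiser criterion (and the eigenvalues a scree plot would display) can be sketched directly from the correlation matrix. This is a sketch on synthetic data, not a full parallel analysis:

```python
import numpy as np

rng = np.random.RandomState(0)
# 8 observed variables driven by 2 latent factors
F = rng.normal(size=(1000, 2))
W = rng.normal(size=(2, 8))
X = F @ W + 0.5 * rng.normal(size=(1000, 8))

corr = np.corrcoef(X, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]  # descending: scree plot values

# Kaiser criterion: retain factors whose eigenvalue exceeds 1
n_factors = int((eigvals > 1.0).sum())
print(n_factors)
```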

### Rotation Methods:
    * Varimax Rotation:
    * Quartimax Rotation:
    * Equimax Rotation:
    * Direct oblimin Rotation:
    * Promax Rotation: