
I have two vectors, for example

Aideal = rand(256,1);

and A_estimated = rand(256,1);

How can I measure the similarity between them? By similarity I mean I want each element of A_estimated to be almost the same as the corresponding element of Aideal.

Can anyone please help?

sanjeev
  • And what do you mean by `almost same`? – Divakar Sep 27 '17 at 19:48
  • I am doing an optimization problem. My function is Aideal=F(Ein); I am finding approximations of Ein such that I get the same output, which I call A_estimated. – sanjeev Sep 27 '17 at 19:50
  • That depends on your situation. Usually minimising the sum of squared distances is not bad, but you have to tell a bit more about your optimisation problem to say for sure. – Leander Moesinger Sep 27 '17 at 20:03
  • Sure! I have a function like Eout=Atra*A*Ein; where Atra=transpose(A). A is a 1D matrix with 256 columns and Ein is a 1D matrix with 256 rows. I don't know A or its transpose; I just get the output Eout for a given Ein. I am trying to find approximations of A. The idea is that if A*Ein = 1 then my Eout becomes A transpose. So in each iteration I find a different Eout, assume that Eout is A transpose, and calculate the output for my initial Ein: Eoutnew=Atra_Estimated*A_Estimated_Ein(initial); Now if the error between Eoutnew and Eout is small, then I can be sure that I have the actual A. – sanjeev Sep 27 '17 at 20:11

2 Answers

mae(A-B)      % mean(abs(A-B))  % Mean absolute error
sae(A-B)      % sum(abs(A-B))   % Sum of absolute errors
norm(A-B,1)   % sum(abs(A-B))   % 1-norm of the difference: sum of the element magnitudes
norm(A-B,inf) % max(abs(A-B))   % Infinity norm: largest absolute element of the difference
mse(A-B)      % mean((A-B).^2)  % Mean squared error
sse(A-B)      % sum((A-B).^2)   % Sum of squared errors
norm(A-B)     % sqrt(sse(A-B))  % 2-norm (Euclidean length) of the difference
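
Note that mae, sae, mse and sse are performance functions from the Neural Network Toolbox; the base-MATLAB expressions in the comments give the same values. A minimal usage sketch on the vectors from the question (the rand data is just a stand-in):

d = Aideal - A_estimated;     % element-wise difference between estimate and ideal

meanAbsErr = mean(abs(d));    % same value as mae(d) with the toolbox
sumSqErr   = sum(d.^2);       % same value as sse(d) with the toolbox
euclidErr  = norm(d);         % square root of the sum of squared errors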
Dr. X

If you want to compare two vectors with respect to cosine similarity, the code below is enough for you:

function [similarity] = CosineSimilarity(x1,x2)
%--------------------------------------------------------------------------
% Syntax:       [similarity] = CosineSimilarity(x1,x2);
% 
% Definition:   Cosine similarity is a measure of similarity between two
%       non-zero vectors of an inner product space that measures 
%       the cosine of the angle between them. The cosine of 0° is 
%       1, and it is less than 1 for any other angle. It is thus a
%       judgment of orientation and not magnitude: two vectors 
%       with the same orientation have a cosine similarity of 1, 
%       two vectors at 90° have a similarity of 0, and two vectors
%       diametrically opposed have a similarity of -1, independent
%       of their magnitude. Cosine similarity is particularly used
%       in positive space, where the outcome is neatly bounded in
%       [0,1]. The name derives from the term "direction cosine":
%       in this case, note that unit vectors are maximally 
%       "similar" if they're parallel and maximally "dissimilar"
%       if they're orthogonal (perpendicular). This is analogous 
%       to the cosine, which is unity (maximum value) when the 
%       segments subtend a zero angle and zero (uncorrelated) 
%       when the segments are perpendicular.[1].
%               
% Inputs:       [x1] is a vector
%               [x2] is a vector
%               
% Outputs:      [similarity] is between -1 and 1 (between 0 and 1 for non-negative vectors)
%                             
% Complexity:   O(n) in the vector length
%
% Dependencies  None.
%               
% Author:       Ugur Ayan, PhD
%               ugur.ayan@ugurayan.com.tr
%               http://www.ugurayan.com.tr
%               
% Date:         May 15, 2016
%
% References    [1] https://en.wikipedia.org/wiki/Cosine_similarity
%--------------------------------------------------------------------------
if ( length(x1) == length(x2) )
    % Cosine similarity: dot product normalised by the product of the norms
    similarity = sum(x1.*x2) / (norm(x1) * norm(x2));
else
    error('Vector dimensions do not match');
end
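
A minimal example call, assuming CosineSimilarity.m is saved on the MATLAB path and using the vectors from the question:

Aideal = rand(256,1);
A_estimated = rand(256,1);
s = CosineSimilarity(Aideal, A_estimated);   % close to 1 when the vectors point in nearly the same direction
fprintf('Cosine similarity: %.4f\n', s);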
Dr. X
  • If you want any additional similarity function, please give the name... – Dr. X Sep 27 '17 at 20:17
  • Thanks for the code. I guess this is not the measure I am looking for. I need something like root mean square error. – sanjeev Sep 27 '17 at 20:22
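
If root mean square error is what is needed, a one-line sketch in base MATLAB (no toolbox required; variable names taken from the question):

rmse = sqrt(mean((Aideal - A_estimated).^2));   % equals norm(Aideal - A_estimated)/sqrt(numel(Aideal))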