It is true that the eigenvectors of the covariance matrix are the principal components. Concretely, the first principal component (the eigenvector associated with the largest eigenvalue) gives the direction of maximum variance in your data; each subsequent principal component captures successively less variance. Note also that the principal components are mutually orthogonal.
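A minimal NumPy sketch of this relationship (the dataset here is made up purely for illustration): diagonalize the sample covariance matrix, sort the eigenpairs by decreasing eigenvalue, and the columns of the resulting eigenvector matrix are the principal components, in order of explained variance.

```python
import numpy as np

# Toy 2-D dataset (hypothetical values, for illustration only)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 1.0], [0.0, 1.0]])

# Covariance matrix of the data (np.cov centers it internally)
C = np.cov(X, rowvar=False)

# Eigendecomposition; eigh is the right choice since C is symmetric
eigvals, eigvecs = np.linalg.eigh(C)

# Sort eigenpairs by decreasing eigenvalue: column 0 is then the
# first principal component, i.e. the direction of maximum variance
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# The principal components are orthogonal (in fact orthonormal)
print(np.allclose(eigvecs.T @ eigvecs, np.eye(2)))  # True
```

The eigenvalues themselves are the variances of the data along each principal direction, which is why sorting them descending orders the components from most to least variable.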
The figure in the Principal Component Analysis article on Wikipedia, referred to earlier, illustrates this. It is a scatter plot of samples from a bivariate Gaussian distribution centered at (1, 3), with standard deviation 3 roughly in the (0.878, 0.478) direction and standard deviation 1 in the orthogonal direction. The direction with standard deviation 3 is the first principal component, and the orthogonal direction is the second. The vectors shown are the eigenvectors of the covariance matrix, scaled by the square root of the corresponding eigenvalue and shifted so that their tails sit at the mean.
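That setup can be reproduced numerically (this is a sketch of the construction described above, not the code behind the Wikipedia figure): build a covariance matrix with variance 9 along (0.878, 0.478) and variance 1 orthogonally, sample from it, and check that the scaled eigenvectors of the sample covariance recover those axes.

```python
import numpy as np

rng = np.random.default_rng(42)

# Distribution described above: mean (1, 3), std 3 roughly along
# (0.878, 0.478), std 1 in the orthogonal direction
mean = np.array([1.0, 3.0])
u1 = np.array([0.878, 0.478])
u1 /= np.linalg.norm(u1)
u2 = np.array([-u1[1], u1[0]])           # orthogonal unit vector
U = np.column_stack([u1, u2])
cov = U @ np.diag([3.0**2, 1.0**2]) @ U.T

X = rng.multivariate_normal(mean, cov, size=5000)

# Eigenpairs of the sample covariance, sorted by decreasing eigenvalue
vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]

# sqrt(eigenvalue) recovers the standard deviations (about 3 and 1),
# and the first eigenvector matches (0.878, 0.478) up to sign
print(np.sqrt(vals))
print(vecs[:, 0])
```

Scaling each eigenvector by the square root of its eigenvalue, as the figure does, makes the arrow lengths proportional to the standard deviation along each axis rather than the variance.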