I believe you can, if you try to follow the path of finding representations of the $SO(n)$ group over a given Hilbert space.
I haven't actually done the calculation, but if it works out the same way as in 3D, you would have something like this:
$H=L_2(\mathbb R^n,\mathbb C)$ would be the Hilbert Space that would correspond to spin 0 particles, and the representation of the $SO(n)$ group would be given by:
$\Phi: SO(n) \times H \rightarrow H$ , with $(\Phi(g)\psi)(x)=\psi(g^{-1}x)$
The generators of this symmetry group would correspond to the angular momentum operators. In this case, since there is no 'internal structure', this is just the orbital angular momentum.
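To make this concrete, here is a minimal sympy sketch (my own illustration, with $\hbar=1$ and the standard differential-operator form $L_x=-i(y\partial_z-z\partial_y)$, etc.) checking that these orbital generators close into the familiar relation $[L_x,L_y]=iL_z$ when acting on an arbitrary smooth function:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
f = sp.Function('f')(x, y, z)  # an arbitrary smooth "wave function"

# Orbital angular momentum as differential operators (hbar = 1)
Lx = lambda g: -sp.I * (y * sp.diff(g, z) - z * sp.diff(g, y))
Ly = lambda g: -sp.I * (z * sp.diff(g, x) - x * sp.diff(g, z))
Lz = lambda g: -sp.I * (x * sp.diff(g, y) - y * sp.diff(g, x))

# [Lx, Ly] f - i Lz f should vanish identically
commutator_check = sp.simplify(Lx(Ly(f)) - Ly(Lx(f)) - sp.I * Lz(f))
print(commutator_check)  # prints 0
```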
As for particles with spin:
The idea is the same, with one critical difference: the Hilbert space you are working on. You would change $H=L_2(\mathbb R^n,\mathbb C)$ to include additional degrees of freedom, and the most direct way is to take the tensor product with another Hilbert space. I don't remember who it is due to, but the following choice ends up leading to the famous 'Pauli equation' (the Schrödinger equation with spin 1/2): $H=L_2(\mathbb R^n,\mathbb C^2) = L_2(\mathbb R^n,\mathbb C)\otimes\mathbb C^2 \cong L_2(\mathbb R^n,\mathbb C)\oplus L_2(\mathbb R^n,\mathbb C)$
In principle you don't know whether it's possible to find a good representation of the $SO(n)$ group on the above-mentioned Hilbert space, so, to have room to work, you try $H=L_2(\mathbb R^n,\mathbb C^k)$.
So, again seeking representations of the symmetry group, you would end up with the following possibility:
$\Phi: SO(n)\times H \rightarrow H$ given by $(\Phi(g)\psi)(x) = \pi_k(g)\psi(g^{-1}x)$
where $\pi_k: SO(n)\times \mathbb C^k \rightarrow \mathbb C^k$ is a representation of the $SO(n)$ group on the finite-dimensional space $\mathbb C^k$. At least one $k$ is guaranteed to work, namely $k=n$ (the defining representation); about the others I'm not sure. In the case of $SO(3)$, you have a representation for each odd $k$ (integer spin), but it's possible to find a representation of the covering group $SU(2)$ for every $k$.
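For the guaranteed case $k=n$ (here $n=3$, with $\pi_3$ the defining representation), here is a small numpy sanity check, with a made-up sample function `psi`, that this recipe really is a representation, i.e. that $\Phi(g_1)\Phi(g_2)=\Phi(g_1 g_2)$:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_rotation():
    """A random matrix in SO(3), built via QR decomposition."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q = q * np.sign(np.diag(r))      # fix column signs
    if np.linalg.det(q) < 0:         # enforce det = +1
        q[:, 0] = -q[:, 0]
    return q

def psi(x):
    """A made-up C^3-valued 'wave function' on R^3, just for testing."""
    return np.array([np.sin(x[0]), np.cos(x[1]), x[2] * np.exp(-x @ x)])

def Phi(g, f):
    """(Phi(g) f)(x) = pi(g) f(g^{-1} x), with pi the defining representation of SO(3)."""
    return lambda x: g @ f(g.T @ x)  # g^{-1} = g^T for rotations

g1, g2 = random_rotation(), random_rotation()
x = rng.normal(size=3)

lhs = Phi(g1, Phi(g2, psi))(x)       # act with g2, then with g1
rhs = Phi(g1 @ g2, psi)(x)           # act once with the product g1 g2
print(np.allclose(lhs, rhs))         # True: Phi(g1) Phi(g2) = Phi(g1 g2)
```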
I find this subject very interesting, even though I haven't had the time to work out the calculations. Unfortunately my knowledge of the subject ends here, so someone else will have to help you with the actual calculations.
If you have it at hand, you can read Ballentine's discussion of angular momentum. I found it very enlightening when I read it, since it discusses this aspect of needing an internal symmetry space (the $\mathbb C^k$ above) and also works out explicitly the cases of spin 1/2 and 1, besides discussing the case of spin 3/2.
Edit:
One thing I forgot to mention is the algebra (generators) of the $SO(n)$ group: $\mathfrak{so}(n)$, the algebra of $n \times n$ antisymmetric real matrices. So labeling the generators by $\Sigma_{ij}$ with an antisymmetric pair of indices, as you did above, is probably the right way to do it.
Also, the commutation relations would be given by the $\mathfrak{so}(n)$ commutation relations, which I don't know by heart, and I'm not certain that they are exactly what you wrote above. Here is something I found on the web on the $\mathfrak{so}(n)$ algebras.
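For reference, in one common convention the basis elements are the real antisymmetric matrices $(\Sigma_{ab})_{ij}=\delta_{ai}\delta_{bj}-\delta_{aj}\delta_{bi}$ (no factors of $i$), and the relations read $[\Sigma_{ab},\Sigma_{cd}]=\delta_{bc}\Sigma_{ad}-\delta_{ac}\Sigma_{bd}-\delta_{bd}\Sigma_{ac}+\delta_{ad}\Sigma_{bc}$; a short numpy loop checks this explicitly:

```python
import numpy as np
from itertools import product

n = 4                       # any n >= 2 works; n = 4 already exercises all index patterns
delta = np.eye(n)           # Kronecker delta

def Sigma(a, b):
    """Basis of so(n): (Sigma_ab)_ij = delta_ai delta_bj - delta_aj delta_bi."""
    m = np.zeros((n, n))
    m[a, b] += 1.0
    m[b, a] -= 1.0
    return m

for a, b, c, d in product(range(n), repeat=4):
    lhs = Sigma(a, b) @ Sigma(c, d) - Sigma(c, d) @ Sigma(a, b)
    rhs = (delta[b, c] * Sigma(a, d) - delta[a, c] * Sigma(b, d)
           - delta[b, d] * Sigma(a, c) + delta[a, d] * Sigma(b, c))
    assert np.allclose(lhs, rhs)

print("so(%d) commutation relations check out" % n)
```

In the physicist's convention with Hermitian generators $M_{ab}=i\Sigma_{ab}$, the same relations pick up an overall factor of $i$ on the right-hand side.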
Continuation:
So, as Peter Kravchuk pointed out, the physical idea behind all this reasoning is that of a transformation law. In physics, the idea of a transformation is captured by the notion of a group, which is a set with a composition operation that makes it possible to talk about things like 'performing one transformation after another' or 'performing the inverse transformation'.
Most of the time, you don't only want a notion of composition and inversion of transformations, you also want some sense of continuity and/or smoothness. Groups that are smooth, i.e. 'differentiable', are called Lie groups.
Most of the time, you are not interested in the groups by themselves but in the 'effect' they have when they act on some kind of physical object. If you have a set of physical objects $X$, what you want is some kind of function that changes these objects but still produces valid physical objects of the same kind, i.e., some function $F: G\times X \rightarrow X$. This is the concept of a group action.
Many times, objects of physical interest are modeled as vectors, in other words, things that it makes sense to 'add' and 'multiply by a scalar'. Think of the positions, velocities, or momenta of particles.
There are also objects that you define point by point in your space, things like the gravitational potential, electric fields, and wave functions!
All these objects are described by fields, i.e., in some sense, functions $E \rightarrow X$, where $E$ is your 'physical space', i.e., your space-time, which is generally either Euclidean or Minkowskian.
Finally, the idea of a (linear) representation of a group is to seek transformation laws on objects that are themselves vectors. So let's start with an example. You have Euclidean 3D space, $E=\mathbb R^3$, and you want to study rotations, i.e., transformations that preserve the usual 3D metric: $\langle x,y\rangle = x_1y_1+x_2y_2+x_3y_3$
In other words, you want functions $A:\mathbb R^3 \rightarrow \mathbb R^3$ such that $\langle Ax,Ay\rangle=\langle x,y\rangle$ for all $x,y\in\mathbb R^3$. You can prove that all functions of this kind are linear, and also that they form a group (and a Lie group too!) in the above-mentioned sense. It's called the orthogonal group $O(3)$. Most of the time we also want to preserve orientation, so we also demand that they satisfy $\det(A)=1$. This subset also forms a group, which is exactly $SO(3)$, the (proper) rotations in 3D Euclidean space.
If you have a group action that respects the linear operations, $\Phi(A)(\alpha x+y) = \alpha(\Phi(A)x)+(\Phi(A)y)$ for all $x,y\in X$, $A\in G$ and $\alpha \in \mathbb F$ (think real or complex numbers), you call this action a representation. It's possible to have objects with 'mixed transformation laws', and usually you don't want that to happen, so you look for objects with a 'definite transformation law', and this is the same as speaking of irreducible representations of your group. From now on I'll use the term representation as a synonym for irreducible representation, unless otherwise stated.
Another way to view representations is to think of $\Phi: G \rightarrow GL(X)$, where $GL(X)$ is the group of all invertible linear transformations (matrices) over $X$. This way you look for something that respects $\Phi(g_1g_2)= \Phi(g_1)\Phi(g_2)$, so you can think of it as looking for 'copies' of the original group inside the group of invertible operators on the space of interest.
Now we start to have fun. If you have this symmetry group on the position space, you want to ask what happens to momentum when you rotate the positions. Since the set of momenta (velocities, if you like) is also $\mathbb R^3$, you have no problem setting $\vec p' = A\vec p$.
So, just to make precise what we are doing:
We have the physical position space: $E = \mathbb R^3$, and we have a group $G=SO(3)$ that acts on $E$, i.e., $\Phi: G\times E \rightarrow E$ by the 'trivial action' $\Phi(A)\vec x = A\vec x$
Now, we have the momentum space (the set of all possible momenta) $P$, which is also equal to $\mathbb R^3$; thus we have no problem giving it the same 'transformation law' as the original positions, i.e., setting the representation $\Phi_p : G \times P \rightarrow P$ equal to the trivial one above. This is equivalent to saying $\vec p' = \Phi_p(A)\vec p = A\vec p$.
Now, we can ask what happens when you have fields defined on physical space, i.e., 'smooth' (or almost smooth) functions of some type: $\mathcal F=\{f\,|\,f:E\rightarrow X\}$. You can ask how these fields transform when you have a 'change of coordinates' induced by the action of $SO(3)$ on the physical space. What usually happens is that you set $\Phi_\mathcal F : G \times \mathcal F \rightarrow \mathcal F$ by putting $(\Phi_\mathcal F(A) f)(x) = \Phi_X(A)f(A^{-1}x)$, where $\Phi_X$ is an action of $G$ on $X$. If $X$ is a vector space, you can also try to find $\Phi_X$ as a representation.
So the idea is that you first change the coordinates and then act on the object that results from it. The inverse inside the argument is the usual active vs. passive story: you can either think that you actively rotate the whole universe one way, or that you rotate your coordinates the other way around. That is why, inside the argument, you use not $A$ itself but its inverse.
If you have a scalar field, for example an electric potential (which is a function $\phi:\mathbb R^3 \rightarrow \mathbb R$), you don't expect it to change its 'value' when you rotate your coordinates, but you do expect its argument to change. So, from these physical considerations you expect $\phi$ to transform as a 'scalar field', i.e., $\phi'(x) = \phi(A^{-1}x)$.
Now, imagine that you have a (static) electric field $\vec E : \mathbb R^3 \rightarrow \mathbb R^3$. Here we expect that if you rotate, not only do the arguments change, but there is also some 'direct effect' on the vector field itself. Since $\vec E(\vec x) \in \mathbb R^3$, you can use the same 'transformation law' (representation) that you use for positions, or for the arguments of functions, to act on the electric field. In the end, you end up with the 'transformation law for vector fields':
$ (\Phi_v(A)\vec E)(\vec x) = A(\vec E(A^{-1}\vec x))$
You read the above equation this way: to transform an electric field under a rotation, you first take your position, transform it inversely so you can compute the right argument, then evaluate the electric field at that point, and after that you rotate the resulting vector the same way you would rotate ordinary positions (vectors).
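As a tiny illustration (my own, with a made-up field), the radial field $\vec E(\vec x)=\vec x$ should be left unchanged by this law, since $A(A^{-1}\vec x)=\vec x$; a few lines of numpy confirm it:

```python
import numpy as np

theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])   # rotation about z

E = lambda x: x                                        # the radial field E(x) = x

def transform(A, field):
    """(Phi_v(A) E)(x) = A E(A^{-1} x): the vector-field transformation law."""
    return lambda x: A @ field(A.T @ x)                # A^{-1} = A^T for rotations

x = np.array([1.0, -2.0, 0.5])
print(np.allclose(transform(A, E)(x), E(x)))           # True: a radial field is rotation invariant
```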
Now you have almost all the information you need. Remember that wave functions are complex scalar fields defined over your physical space, with the additional property that they are 'square-integrable' (they have finite norm). You express that by saying that wave functions are members of the set $\{\psi:\mathbb R^3 \rightarrow \mathbb C \mid \int_{\mathbb R^3}\psi^*(x)\psi(x)\, d^3 x < \infty \}$, which is denoted $L_2(\mathbb R^3,\mathbb C)$, the space of (Lebesgue) square-integrable complex functions.
Now, you want to ask what the possible transformations of wave functions are. Since they are scalar functions, you expect them to transform as scalar fields: $(\Phi(A)\psi)(x) = \psi(A^{-1}x)$
Now, things get really interesting when you try to construct a 'multi-component wave function', which would represent the 'internal degrees of freedom' of your particles. To achieve that, you change from $L_2(\mathbb R^3,\mathbb C)$ to $L_2(\mathbb R^3,\mathbb C^k)$, so as to have an 'internal space' $\mathbb C^k$.
So, you go back and ask again: 'how do these things transform?', or, put another way, 'what are the possible ways for these things to transform?', since you are not obliged to set $k=3$, and, if you think carefully, you are living in $\mathbb C^k$, not $\mathbb R^k$!
Since you already have the 'coordinate change' handled (i.e., you have what will become the 'orbital part' of the angular momentum), you need to ask what happens to the internal space.
So, you want to find all 'transformation laws' (i.e., representations) of objects of $\mathbb C^k$. In other words, you want to find all representations of $SO(3)$ on $\mathbb C^k$. This is usually a (very) difficult task, so normally you don't tackle it directly like that, but in the end you find that you only have (honest) representations for $k = 2l+1$, with $l\in \mathbb Z_{\ge 0}$, which you would interpret as the 'integer-spin representations' (although we haven't spoken the word spin until now!).
So, how do we find representations of the $SO(3)$ group? The standard method is to look at the 'infinitesimal transformations' of the group near the origin (more precisely, the tangent space at the identity of the group). These infinitesimal transformations themselves form a vector space, with an additional operation called the (Lie) bracket, which is in some sense a 'product'. Since vector spaces endowed with products are called algebras, these structures are called Lie algebras.
A Lie bracket behaves exactly like a commutator (this can be made precise), and in the case of matrix Lie algebras (like the Lie algebra of $SO(3)$), it is exactly the commutator of the usual matrix product. This way you can speak of 'commutation relations' between the elements of the Lie algebra.
Just as in the case of Lie groups, you can speak of representations of Lie algebras; instead of preserving the group composition operation, they preserve the Lie bracket.
Usually it is simpler to find representations of the Lie algebra than of the original Lie group, since for the former you 'only' need to find operators that can act as generators of the image of the representation: if they have the same 'commutation relations' as a basis of the original Lie algebra, all you have to do is define the correspondence and extend by linearity.
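For example, in the usual physics convention $[J_i,J_j]=i\epsilon_{ijk}J_k$ (with $\hbar=1$), the operators $S_i=\tfrac12\sigma_i$ built from the Pauli matrices reproduce exactly these commutation relations on $\mathbb C^2$, which is all it takes to have a representation of the algebra there. A quick numerical check:

```python
import numpy as np
from itertools import product

# Pauli matrices and spin-1/2 generators S_i = sigma_i / 2 (hbar = 1)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
S = [sx / 2, sy / 2, sz / 2]

def eps(i, j, k):
    """Levi-Civita symbol."""
    return (i - j) * (j - k) * (k - i) / 2

# Verify [S_i, S_j] = i eps_ijk S_k
for i, j in product(range(3), repeat=2):
    lhs = S[i] @ S[j] - S[j] @ S[i]
    rhs = sum(1j * eps(i, j, k) * S[k] for k in range(3))
    assert np.allclose(lhs, rhs)

print("spin-1/2 generators satisfy the so(3) commutation relations")
```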
So, how do we recover the information about the original group from its Lie algebra?
The idea is that you can construct (under some conditions) representations of the Lie group from representations of the Lie algebra. This is done by 'exponentiating' the elements of the Lie algebra to form elements of the group (that idea of putting $U(\vec\theta)=e^{-\frac{i}{\hbar}\vec \theta \cdot \vec J}$). In a general setting, this is only valid locally.
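As a concrete sanity check of the exponentiation (assuming the standard generator $J_z$ of rotations about the $z$ axis in the defining representation, with $\hbar=1$), $e^{-i\theta J_z}$ should reproduce the familiar $3\times 3$ rotation matrix:

```python
import numpy as np
from scipy.linalg import expm

theta = 0.4

# Generator of rotations about z in the defining (vector) representation, hbar = 1
Jz = np.array([[0, -1j, 0],
               [1j,  0, 0],
               [0,   0, 0]])

U = expm(-1j * theta * Jz)                    # U(theta) = exp(-i theta Jz)

R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0,              0,             1]])

print(np.allclose(U, R))                      # True: we recover the rotation matrix
print(np.allclose(U @ U.conj().T, np.eye(3)), # and it is orthogonal...
      np.isclose(np.linalg.det(U).real, 1.0)) # ...with determinant +1
```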
If you try searching for representations of the Lie algebra of $SO(3)$ (denoted $\mathfrak{so}(3)$), which is exactly the algebra of $3\times 3$ antisymmetric matrices, you find that it has representations on every $\mathbb C^k$.
Unfortunately (or not), you also find that you can't recover a corresponding representation of $SO(3)$ for even $k$. This is related to the 'double-valuedness' of the representations for even $k$. This is the 'extra $-1$ factor' that half-integer spins pick up under a full $2\pi$ rotation, and also the 'need' for a $4\pi$ rotation to fully return to the identity.
What you end up doing is looking for representations of the group whose representations you actually get by exponentiating the Lie algebra representations, namely the simply connected (universal covering) group, which in the case of $SO(3)$ is $SU(2)$. Since locally the two are 'essentially the same' and for each element of $SO(3)$ there are 2 elements of $SU(2)$, the latter is called the double cover of the former. For even $k$, these are the 'spinorial representations'.
Finally, one proves that there is a representation of $SU(2)$ on every $\mathbb C^k$, so you can accommodate both integer and half-integer spin using $SU(2)$ as your 'acting rotation group'. You can do this because you can recover the rotations of the Euclidean 'physical space' from it.
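Both statements are easy to see numerically in the spin-1/2 case (assuming $S_z=\sigma_z/2$, $\hbar=1$, and the standard covering map $R_{ij}=\tfrac12\mathrm{Tr}(\sigma_i U\sigma_j U^\dagger)$, which sends both $U$ and $-U$ to the same rotation):

```python
import numpy as np
from scipy.linalg import expm

# Pauli matrices and the spin-1/2 generator about z (hbar = 1)
sigma = [np.array([[0, 1], [1, 0]], dtype=complex),
         np.array([[0, -1j], [1j, 0]], dtype=complex),
         np.array([[1, 0], [0, -1]], dtype=complex)]
Sz = sigma[2] / 2

# A 2*pi rotation is -1 on spinors; only 4*pi returns to the identity
print(np.allclose(expm(-1j * 2 * np.pi * Sz), -np.eye(2)))   # True
print(np.allclose(expm(-1j * 4 * np.pi * Sz),  np.eye(2)))   # True

def covering_map(U):
    """The 2-to-1 map SU(2) -> SO(3): R_ij = (1/2) Tr(sigma_i U sigma_j U^dagger)."""
    return np.array([[0.5 * np.trace(sigma[i] @ U @ sigma[j] @ U.conj().T).real
                      for j in range(3)] for i in range(3)])

theta = 0.9
U = expm(-1j * theta * Sz)                       # spin-1/2 rotation about z by theta
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0,              0,             1]])

print(np.allclose(covering_map(U), R))                   # True: U projects to the rotation
print(np.allclose(covering_map(-U), covering_map(U)))    # True: U and -U give the same rotation
```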
To recover the idea of spin, it's necessary to have a way to 'measure' the total spin of the particle in question, which is done via $S^2 = S_x^2+S_y^2 + S_z^2$. So, how do we interpret this object?
The idea is that it's what is called the 'Casimir invariant' of the group, and you use it to classify all (irreducible) representations of your algebra, and thus of your original group: on the spin-$s$ irreducible representation it takes the constant value $s(s+1)$ (in units of $\hbar^2$). That way you have pretty much all of '3D spin theory' built here.
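A quick check of that last statement for $s=1/2$ (Pauli matrices over two) and $s=1$ (the vector representation $(S_k)_{ij}=-i\epsilon_{kij}$), with $\hbar=1$:

```python
import numpy as np

# Spin-1/2 generators: S_k = sigma_k / 2  (hbar = 1)
half = [np.array([[0, 1], [1, 0]], dtype=complex) / 2,
        np.array([[0, -1j], [1j, 0]], dtype=complex) / 2,
        np.array([[1, 0], [0, -1]], dtype=complex) / 2]

# Spin-1 generators in the vector representation: (S_k)_ij = -i * eps_kij
def eps(i, j, k):
    """Levi-Civita symbol."""
    return (i - j) * (j - k) * (k - i) / 2

one = [np.array([[-1j * eps(k, i, j) for j in range(3)] for i in range(3)])
       for k in range(3)]

for s, S in [(0.5, half), (1.0, one)]:
    casimir = sum(Sk @ Sk for Sk in S)                     # S^2 = Sx^2 + Sy^2 + Sz^2
    expected = s * (s + 1) * np.eye(len(casimir))
    print(s, np.allclose(casimir, expected))               # True for both
```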
So, from here, you can understand my original suggestion: If you want to search for a higher dimensional spin, you start with a higher dimensional position space $E=\mathbb R^n$, and repeat the same questions that I developed here:
1) the usual Euclidean inner product is $\langle x,y\rangle= \sum_{i=1}^n x_iy_i$, and the group that preserves it and also preserves orientation ($\det A=1$) is called $SO(n)$;
2) the space of $k$-component wave functions is $H=L_2(\mathbb R^n,\mathbb C^k)$, and you seek representations of $SO(n)$ on $H$;
3) the covering group of $SO(n)$ is called the spin group, $\mathrm{Spin}(n)$.
I believe that it's possible to find irreducible representations of $\mathrm{Spin}(n)$ for all $k$, but I'll confirm later. If that is possible, it allows an interpretation similar to the usual 3D spin. As someone mentioned here or in another topic, Cartan's book is a good reference on the subject. The rest is in my original answer.
This post imported from StackExchange Physics at 2014-03-09 08:45 (UCT), posted by SE-user user23873