Converting from symbolic to matrix
From: theta
Subject: Converting from symbolic to matrix
Date: Sun, 21 Nov 2010 00:42:44 -0800 (PST)
I have a cell array defined as:
g =
{
[1,1] =
(2.0)*x1+(40.0)*(-x4+x1)^(3.0)+(20.0)*x2
[1,2] =
(4.0)*(-(2.0)*x3+x2)^(3.0)+(20.0)*x1+(200.0)*x2
[1,3] =
-(8.0)*(-(2.0)*x3+x2)^(3.0)-(10.0)*x4+(10.0)*x3
[1,4] =
(10.0)*x4-(40.0)*(-x4+x1)^(3.0)-(10.0)*x3
}
After substituting particular values of x1, x2, x3, x4 using subs(), I got
gk as:
gk =
{
[1,1] =
306.0
[1,2] =
-144.0-1.4693486515086033901E-18*I
[1,3] =
-2.0+2.9386973030172067802E-18*I
[1,4] =
-310.0
}
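For context, the substitution step was along these lines (a1..a4 stand in
for the actual numeric values, and I may be misremembering the exact subs()
calling form):

gk = cell (1, 4);
for i = 1:4
  # substitute the numeric point into the i-th symbolic expression
  gk{1,i} = subs (g{1,i}, {x1, x2, x3, x4}, {a1, a2, a3, a4});
endfor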
Firstly, how do I convert the 2nd and 3rd entries to real values? The
imaginary part seems to come from nowhere; all my calculations are in real
numbers.
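Something like the following is what I am hoping for, though I do not know
whether real() even accepts these symbolic `ex' values:

r = real (gk{1,2});   # hoped-for result: -144.0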
Secondly, how do I convert this cell array gk into a matrix so that I can
perform matrix multiplication? Every time I try to assign one of these
elements into an array, I get an error saying wrong type `ex'. What does
this mean, and what is the workaround?
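For reference, this is roughly the assignment that triggers the error:

G = zeros (1, 4);
for i = 1:4
  G(i) = gk{1,i};   # fails here with: error: wrong type `ex'
endfor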
I would be grateful for any help.
Thanks
Apoorv