Thursday, July 29, 2010

Finished and plotted new data from the most recent simulation. Looks great! Made my best fit lines smoother. Went through old images and Maya's plots and picked out a few for my poster. Wrote more stuff for it.

things to do tomorrow (last day!):
-finish annotating most recent simulation program and plotting program
-finish up poster
-latex doc (and send to beth)
-think of witty, yet descriptive title for poster

Today was Gail's last day in the lab. She'll soon be in Austin, which is super exciting. She's helped me so much this summer and it's been great getting to know her!

Wednesday, July 28, 2010

Finishing up some last few things

As of this morning, I have a beautiful plot to show for my work this summer!... Well, maybe not the whole summer, but I feel like it at least defines the work I've been doing this past month. It's similar to the one I last posted, but this time with sigma as determined by Gaussian statistics on there as well, and more neatly presented. As Beth correctly predicted, the Gaussian sigmas are much lower than our peak values, thus vastly overestimating confidence levels, especially at levels higher than 90%. My last simulation should be done by the end of the week, and then I can apply this same plot to that data, which should look much smoother. Also finishing up my latex document and working on my poster. Doing both of these, writing out explanations of the work I've been doing (along with this blog), is really pushing me to fully understand all of the smaller details of my work that I've been hazy on. This may sound pretty corny, but the more I really think about it, the cooler all of this work gets. Hopefully I'll have the poster done by tomorrow afternoon, and then I can spend Friday just organizing my directories, finishing and editing my latex doc, and putting the new data from the most recent simulation into another plot.

Monday, July 26, 2010

Setting our Detection Threshold

The simulation ran this weekend successfully. Yay!

The next step was to plot the peak(S) levels at 75%, 90%, 99%, and 99.9%.
Maya was great and went through my entire code with me, and helped me fully debug my plotting program.

I added best fit lines to it, as well as the detection threshold set in the Invisibles paper. Best fit lines don't sound too difficult, but because there were 3 different bands of stellar densities, it was important to have a best fit for each of the 3. After working through some frustrating for loops, I ended up just sorting the points by their reference numbers, and it worked out quite nicely. I've attached the graph below! (click on it for a better view)

The main issue is the overlap between the best fit lines. That's because I had to create a y-axis array to plot, and I just chose points that fell into the range of each of the three densities.
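Roughly, the threshold-and-best-fit logic looks like this, sketched in Python/NumPy rather than the IDL I'm actually using. The density bands and all the numbers here are made-up stand-ins; restricting each fit to its own band's x-range is one way to keep the lines from overlapping:

```python
import numpy as np

# Toy stand-in for the simulation output: one peak(S) value per fake
# field, each field drawn at some stellar density (stars/deg^2).
rng = np.random.default_rng(0)
dens = rng.uniform(100, 5000, 300)
peaks = 4.0 + 0.4 * np.log10(dens) + rng.normal(0, 0.2, dens.size)

# Detection-threshold levels: percentiles of the peak(S) distribution.
levels = [75, 90, 99, 99.9]
thresholds = {p: float(np.percentile(peaks, p)) for p in levels}

# Three density bands, one linear best fit per band, each evaluated
# only over its own band's x-range so the fitted lines don't overlap.
bands = [(100, 500), (500, 2000), (2000, 5000)]
fits = []
for lo, hi in bands:
    sel = (dens >= lo) & (dens < hi)
    slope, intercept = np.polyfit(np.log10(dens[sel]), peaks[sel], 1)
    x = np.linspace(np.log10(lo), np.log10(hi), 50)
    fits.append((x, slope * x + intercept))
```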

Friday, July 23, 2010

4th fake field simulation is a go!

Coming to my 4th and final field simulation. Yay!

This one is designed so we can have a wide range of star densities, and will be able to match up to the simulations that Maya has been working on. All algorithm parameters stay the same, with stars/deg^2 ranging from 100-5000. The way this is set up, that means 52 different star simulations. The field size has also been expanded to 5 x 10. It's all set up and ready to go for the weekend!

Things to do in the next few days:

- fix the peak vs. max avg value
I edited the structure in my simfields2plots program so that smoothing length is included, which will let me color-code this graph by smoothing length. But I had to re-run the program, so once that's complete, I'll be able to finish fixing the graph.
- make those graphs for each trial, and include description of trial on graph

- debug cumulative histogram for max values for the 3rd trial
As of now, I have cumulative histograms for time and peak for smoothing length = 15.0. I need to fix the max graph, as the overplotting isn't working quite right. Make those histograms for sl = 5.0 and 30.0 as well.

- provide in-depth accurate descriptions of each of the trials in my latex doc.

- organize my directories, and make a notes file for each of them

- get together notes from this summer.
I started out with a handy lab notebook, but abandoned it after its encounter with the rain. Seeing Maya's notebook really makes me want to compile some IDL/Unix/other useful notes so that I'll be able to turn back to them after this summer.

On a side note, I was feeling kind of frustrated this morning. Maybe it was because I wasn't getting enough sleep or something, but today everything was just taking longer than it should have, and I kept making embarrassing IDL mistakes that I knew I could have avoided. But on a peppier note, 2 really great things happened this week:

1) Got to check out the telescope with Gail and Marjon. So incredibly cool. Really makes me want to get involved with public observing next year, and I cannot wait to get comfortable learning how to use it. Also had a great time with both of them.

2) Had a wonderful talk with Beth. Super excited for next semester. It's going to be tough, but so so great.

Wednesday, July 21, 2010

My 3 separate field simulations didn't run yesterday (note to self: always wait more than a few minutes to make sure code is up and running, and CHECK from home). But after some debugging, I successfully ran 2 of the 3 tests, for smoothing lengths of 15.0 and 30.0 arcmin. Since I was using a 5 x 5 field, I temporarily set it to a plain 2 x 2 field to debug more quickly. I would have used 1 x 1, but I was worried about some of the coding and wanted to make sure it would work for more than just 1 x 1 fields.

Created a new program for the plots for these simulations, and set it up so that tomorrow morning I can (hopefully) run it before class. Having a good time figuring out how to plot them. In principle these should be easier to plot, since there are fewer variables. But because all of my other histograms shared lots of algorithm parameters, I could easily just copy my previously used code and swap in the new parameters, whereas now, though I'm still copying code, I'm changing it more drastically than before. I'm getting to know better what each line of my code really means, so that's been interesting.

It's starting to dawn on me that there's only a week and a half of work left -- aka, I better get my act together, clean up my directories and programs, add lots of notes, and document this all in my latex document.

As for my 3 x 3 field simulation... well, there were some issues, but they've all been resolved. I have my beautiful cumulative histograms, ready to go!

Tuesday, July 20, 2010

Quick update before I forget everything:

Finally made cumulative histograms for the 3 x 3 field simulation. Took me a while, as I had some coding bugs that were trickier than I thought. So far I just have them for max values, but hopefully by the end of the day I'll have the other two. Also starting to run some new field simulations for Andromeda XIX-like galaxies. Hopefully they will go smoothly, and tomorrow I'll have some more data to work with.

Also had group meeting (our second to last for the summer!), and heard about everyone's work. Things seem like they're going pretty well for everyone... so so cool to hear about Maya's work especially -- I'm really interested to see where it ends up in the next couple of weeks.

Thursday, July 15, 2010

New Field Simulations

The first one is the same field simulation with the added pixel sizes of .25 and .5, an increased field of 3 x 3 deg, and 10,000 fields. Tried running it overnight, but it took way too long. An edited version (no combinations of small pixel sizes with large scale lengths) will run tonight. I'll check in on its progress tomorrow morning. Hopefully it will run smoothly (haha, get it?), and by Monday afternoon I'll have some cumulative histograms for it.

The next three simulations will have a pixel size of 2, a field size of 5x5 deg, an nscale of 3, and a range of numbers of stars. Each simulation will have one smoothing length, and there will be 1000 trials per combination. And then we'll be inputting And XIX-like dwarfs!

Also, using the code from the Invisibles paper, I made a nice detection threshold plot. Not exactly sure if this is what Beth wants, but possibly.

Rafting tomorrow!

Tuesday, July 13, 2010

detection thresholds

Getting to the exciting part of this data structure and histograms: trying to redefine detection thresholds!

Earlier this morning, I plotted my max(S) values onto the image from the Invisibles paper (see below) to verify that I'm getting accurate values, compare them with the paper's, notice interesting trends, etc.

Walsh et al. (2009)

For the values closest to the ones cited in the paper (nscl = 1, pixel size = 1, smoothing length = 4.0), I got 6.08 for 1,000 stars and 5.62 for 10,000 stars. Across all my different nscales, keeping the rest of my parameters the same, I get values ranging from 5.25-6.08. In the Invisibles paper, max(S) values lie within that range, so considering that we only ran 100 fields, it's looking pretty good. Changing nscale shifted the numbers more than expected, though, so that's something to look back over.

Also started changing my program. This time through, I expanded my field size to a 3 degree x 3 degree field, got rid of smoothing length = 6.0 and nscale = 1.0, and added pixel size = 0.25. This new pixel size should slow down computing time, so if it takes more than a minute per trial, I think we'll end up nixing it. I tried the 3 x 3 field, but it's taking up too much memory right now, so I'm going with a 2 x 2 field for the time being. If it doesn't work tomorrow, Beth will help me possibly streamline my code so it uses less memory.

My time histogram overplots were really wonky. Note to self: always check images before coming to group meeting. I realized I had just quickly copy-and-pasted my titles and ranges, and edited the incorrect version. But I got them fixed and ready to go.

There's also a weird trend in some of my histograms. For nscale = 4.0, smoothing length = 6.0, the max(S) is really low. Hopefully this won't continue with a larger field size. If it does, it will be something to closely look into.

Things to do:

- Make cumulative histograms for more in-depth comparisons (can't use plothist --- need to use histogram)
- Run field simulation with 3x3 field size
- See how long trials take with 0.25 pixel size
- Update my latex document (it's been a while...)
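For the record, the cumulative histogram itself is simple once you drop plothist: sort the values instead of binning them. A Python sketch of the idea (the real code is IDL, and the numbers here are fake):

```python
import numpy as np

# Fake max(S) values from 100 simulated fields (placeholder numbers).
rng = np.random.default_rng(1)
max_s = rng.normal(5.5, 0.4, 100)

# Cumulative histogram: for each value, the fraction of fields whose
# max(S) falls at or below it. Sorting replaces explicit binning.
x = np.sort(max_s)
frac = np.arange(1, x.size + 1) / x.size

# Reading a confidence level off the curve, e.g. the 90% point:
thresh_90 = x[np.searchsorted(frac, 0.90)]
```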

All in all, really glad to see that my calculations are looking similar to the ones in the Invisibles paper!

Friday, July 9, 2010

pretty plots = success!

As of today, I have 18 colorful plots covering 89 different combinations of max values. Each plot overlays 5 histograms for the different smoothing lengths. The title of each graph is coded to read out the specific combination, and a color-coded legend shows the smoothing length on each graph. The x-range is set by the minimum and maximum of each of the combinations.

All in all a success!

For next week: finish editing graphs, and work on the table.

Thursday, July 8, 2010

pretty plots in the near future!

Sat down with Beth today and discussed the histogram plots I have so far. In order to better analyze them, my next task is to organize them in a way that makes it easy to see the difference between the variables, say a smoothing length of 2 versus a smoothing length of 4. I'm going to combine the 90 plots I have at the moment and condense them into 18. Each graph will be a specific combination of nscale, pixel size, and number of stars, with 5 overplots for the 5 different smoothing lengths.

I've had to tinker with my original data structure to do this, and have added a new structure which I will loop through to make the plots. The next step is to add my actual plot statements with the overplots. For those, I'll also create titles that accurately describe each plotted combination with a general command.
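The regrouping from 90 runs into 18 plots boils down to keying each run on its (nscale, pixel size, number of stars) combination. A Python sketch of the idea, with a hypothetical parameter grid standing in for my real one:

```python
from itertools import product

# Hypothetical parameter grid: 18 combinations of (nscale, pixel size,
# number of stars), each run at 5 smoothing lengths -> 90 runs total.
nscales = [2.0, 3.0, 4.0]
pixel_sizes = [0.5, 1.0]
n_stars = [100, 1000, 10000]
smoothing = [2.0, 4.0, 6.0, 15.0, 30.0]

# Group the 90 runs by their (nscale, pixel size, nstars) combination,
# so each of the 18 plots can overplot its 5 smoothing lengths.
plots = {}
for ns, px, st, sl in product(nscales, pixel_sizes, n_stars, smoothing):
    plots.setdefault((ns, px, st), []).append(sl)

# A title coded to read out the combination, one per plot.
titles = ["nscale=%s pix=%s nstars=%s" % key for key in plots]
```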

Also fixed my calculation table: fixed the layout and added another calculation to it. Still have to add the columns that describe the combination for each calculated value. I'll do that with a for loop, I think? I was also thinking I might be able to do it with a where statement and just have one column that lists the combination out... but that's for tomorrow.

I'm hoping that for tomorrow afternoon I'll have some pretty plots to show.

Wednesday, July 7, 2010

Almost accomplished my to-do list!

So I fixed my time plot issue. I decreased the bin size, and then made the tick intervals less frequent so the unit labels wouldn't overlap.

Also created a table of my calculations for max, average, etc. with a for loop. For the loop, I hard-coded a value for the number of field variations we have. That's still in there; something to de-hardwire later today or tomorrow.
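The de-hardwiring amounts to looping over whatever variations exist rather than a fixed count. A Python sketch of the table step (variation names and values here are invented):

```python
import numpy as np

# Hypothetical results: 100 max(S) values for each field variation.
rng = np.random.default_rng(2)
results = {"combo_%d" % i: rng.normal(5.5, 0.3, 100) for i in range(6)}

# Build the calculation table by looping over whatever variations
# exist, instead of hard-coding their number.
table = []
for name, vals in results.items():
    table.append((name, vals.max(), vals.mean(), vals.std()))
```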

Started my And XIX analysis to see if/where it would fall on the fundamental manifold. I have the mean surface brightness, but even in the SPLASH survey, there's no value of sig v for And XIX. I tried the calculation using 8.8 km/s, the value Zaritsky uses for Ursa Minor and Draco, just to see if it would look reasonable. From what I've got so far, it looks like And XIX will also sit above the projection, just as Ursa Minor and Draco do, but much lower down on the x-axis on account of its half-light radius of 1.7 kpc.

Tuesday, July 6, 2010

A lot to blog about:

Friday and today I worked on fixing all the for loops within my data structure. As of now, I have 5 for loops, in which I vary smoothing length, number of stars, nscale, and pixel size. That means I have a total of 89 variations of my fake sky fields, and for each variation, I have created 100 fields.
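The nested loops can be written as a single product over the parameter lists. This is a Python sketch with a completely made-up grid (including one hypothetical excluded combination, which is one way a 90-point grid could end up as 89 variations; my real parameter values differ):

```python
from itertools import product

# Made-up parameter grid (the real runs varied smoothing length,
# number of stars, nscale, and pixel size).
smoothing = [2.0, 4.0, 6.0, 15.0, 30.0]
n_stars = [100, 1000, 10000]
nscales = [2.0, 3.0, 4.0]
pixel_sizes = [0.5, 1.0]

variations = []
for sl, st, ns, px in product(smoothing, n_stars, nscales, pixel_sizes):
    if (sl, st, ns, px) == (30.0, 10000, 4.0, 0.5):  # hypothetical skip
        continue
    variations.append((sl, st, ns, px))

fields_per_variation = 100
total_fields = len(variations) * fields_per_variation
```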

From this data, I've started to create plots and graphs of max, time, etc.

As a side project, I started reading up on the "fundamental manifold," which is supposed to encompass the Local Group dwarfs in a single function. But this single function is a little wonky, and Ursa Minor and Draco don't align so neatly. So we'll see how useful this ends up being. Tomorrow, I'll plot And XIX on the graph from the Zaritsky (2008) paper and see if it lies within the parameters of the function.

Things to do:

-fix time in plots --- my units of time are too large on my plots to show anything, so I need to shrink that down

-de-hardwire ref in multiple plots for loop --- to create my plots, I made a for loop with a variable I called ref, and hard-coded the number of variations I had; need to make it more general

-create table of calculations from structure -- re-run calculations I did for one variation and save into a table

-add and xix to manifold

-update latex doc

Thursday, July 1, 2010

for loops

Quick blog post to re-cap what I've done today:

Worked on using for loops over the data structure I've created over the past couple of days, so I can change pixel size, smoothing length, nscale, etc. for the same field N times. As of now, with help from Gail, I have 2 for loops that do this, but only for different pixel sizes. I've tried a couple of ways of adding another for loop, but it takes up so much memory on the server that I know there's a better way out there. The books and online help articles say that for loops are really slow and that there are better ways of doing things. I'm toying with that idea, but at the same time, none of the examples I've seen so far use for loops in the same context that I am. I'm hoping that my quick registration break at Nova will give me some time to think about it, and when I come back I'll be able to figure it out.
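One memory-friendlier pattern, sketched in Python rather than the IDL I'm using, is to generate and process one field at a time instead of storing them all. The field contents below are placeholder uniform star positions, not the real simulation:

```python
import numpy as np

# A memory-light alternative to nesting more for loops: generate one
# fake field at a time and discard it after processing, instead of
# holding every field in memory at once.
def fields(pixel_sizes, smoothing_lengths, n_trials, n_stars=1000, seed=0):
    rng = np.random.default_rng(seed)
    for px in pixel_sizes:
        for sl in smoothing_lengths:
            for trial in range(n_trials):
                stars = rng.uniform(0.0, 1.0, (n_stars, 2))  # x, y positions
                yield px, sl, trial, stars

count = 0
for px, sl, trial, stars in fields([0.5, 1.0], [2.0, 4.0], 3):
    count += 1  # real code would smooth the field and record max(S) here
```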