Thursday, July 29, 2010

Finished and plotted new data from the most recent simulation. Looks great! Made my best fit lines smoother. Went through old images and Maya's plots and picked out a few for my poster. Wrote more stuff for it.

things to do tomorrow (last day!):
-finish annotating most recent simulation program and plotting program
-finish up poster
-latex doc (and send to beth)
-think of witty, yet descriptive title for poster



Today was Gail's last day in the lab. She'll soon be in Austin, which is super exciting. She's helped me so much this summer and it's been great getting to know her!

Wednesday, July 28, 2010

Finishing up some last few things

As of this morning, I have a beautiful plot to show for my work this summer!... Well, maybe not the whole summer, but I feel like it at least sums up the work I've been doing this past month. It's similar to the one I last posted, but this time with sigma as determined by Gaussian statistics on there as well, and more neatly presented. As Beth correctly predicted, the Gaussian sigmas are much lower than our peak values, and so they vastly overestimate confidence levels, especially above 90%. My last simulation should be done by the end of the week, and then I can make this same plot for that data, which should look much smoother. Finishing up my latex document. Working on my poster. Doing both of these -- writing out explanations of the work I've been doing (along with this blog) -- is really pushing me to fully understand all of the smaller details of my work that I've been hazy on. This may sound pretty corny, but the more I really think about it, the cooler all of this work gets. Hopefully I'll have the poster done by tomorrow afternoon, and then I can spend Friday organizing my directories, finishing and editing my latex doc, and putting the new data from the most recent simulation into another plot.
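Since the actual analysis was in IDL, here's a rough Python sketch of the comparison behind that plot: a detection threshold computed from Gaussian statistics versus one read straight off the empirical distribution. The lognormal stand-in and its parameters are invented for illustration; the real max(S)/peak values come from the field simulations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for peak values from many simulated fields;
# the real distribution comes from the simulations, not a lognormal.
peaks = rng.lognormal(mean=1.5, sigma=0.3, size=10_000)

for level, z in [(0.90, 1.2816), (0.99, 2.3263), (0.999, 3.0902)]:
    gauss = peaks.mean() + z * peaks.std()   # threshold assuming Gaussianity
    empirical = np.quantile(peaks, level)    # threshold straight from the data
    print(f"{level:.1%}: gaussian={gauss:.2f}  empirical={empirical:.2f}")
```

For a skewed distribution like this, the Gaussian thresholds sit below the empirical high quantiles, which is the overestimated-confidence effect described above.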

Monday, July 26, 2010

Setting our Detection Threshold

The simulation ran this weekend successfully. Yay!

The next step was to plot the peak(S) levels at 75%, 90%, 99%, and 99.9%.
Maya was great and went through my entire code with me, and helped me fully debug my plotting program.

I added best fit lines to it, as well as the detection threshold set in the Invisibles paper. Best fit lines don't sound too difficult, but because there were 3 different stellar densities, it was important to have a best fit for each of the 3. After working through some frustrating for loops, I ended up just sorting them by their reference numbers, and it worked out quite nicely. I've attached the graph below! (click on it for a better view)


The main issue is the overlap between the best fit lines. That's because I had to create a y-axis array to plot, and I just chose points that fell into the range of each of the three different densities.
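For illustration, here's a Python sketch (the real code is IDL) of pulling percentile thresholds out of simulated peak values and fitting lines against density. The densities and the peak distribution below are made up, and the actual fits were done per density group via reference numbers rather than per confidence level.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical peak(S) samples at several stellar densities (stars/deg^2).
densities = np.array([100, 300, 1000, 3000, 5000])
peaks = {d: rng.normal(4 + 0.5 * np.log10(d), 0.4, size=1000) for d in densities}

# Empirical detection thresholds at each confidence level, per density.
levels = [75, 90, 99, 99.9]
thresh = {lv: np.array([np.percentile(peaks[d], lv) for d in densities])
          for lv in levels}

# One best-fit line per confidence level: threshold vs log10(density).
for lv in levels:
    slope, intercept = np.polyfit(np.log10(densities), thresh[lv], 1)
    print(f"{lv}%: threshold ~ {slope:.2f} * log10(density) + {intercept:.2f}")
```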

Friday, July 23, 2010

4th fake field simulation is a go!

Coming to my 4th and final field simulation. Yay!

This one is designed so we can have a wide range of star densities, and will be able to match up to the simulations that Maya has been working on. All algorithm parameters stay the same, with stars/deg^2 ranging from 100-5000. The way this is set up, that means 52 different star simulations. The field size has also been expanded to 5 x 10. It's all set up and ready to go for the weekend!

Things to do in the next few days:

- fix the peak vs. max avg value
I edited my structure for my simfields2plots program so that smoothing length was included, so that I could color-code this graph by smoothing length. But I had to re-run the program, so once that is complete, I'll be able to finish fixing that graph.
- make those graphs for each trial, and include description of trial on graph

- debug cumulative histogram for max values for the 3rd trial
As of now, I have cumulative histograms for time and peak for smoothing length = 15.0. I need to fix the max graph, as the overplotting isn't working quite right. Make those histograms for sl = 5.0 and 30.0 as well.

- provide in-depth accurate descriptions of each of the trials in my latex doc.

- organize my directories, and make a notes file for each of them

- get together notes from this summer.
I started out with a handy lab notebook, but abandoned it after its encounter with the rain. Seeing Maya's notebook though really makes me want to compile some idl/unix/other useful notes so that I'll be able to turn back to it after this summer.


On a side note, I was feeling kind of frustrated this morning. Maybe it was because I wasn't getting enough sleep or something, but today everything was just taking longer than it should have, and I kept making embarrassing idl mistakes that I knew I could have avoided. But on a peppier note, 2 really great things happened this week:

1) Got to check out the telescope with Gail and Marjon. So incredibly cool. Really makes me want to get involved with public observing next year, and I cannot wait to get comfortable learning how to use it. Also had a great time with both of them.

2) Had a wonderful talk with Beth. Super excited for next semester. It's going to be tough, but so so great.

Wednesday, July 21, 2010

My 3 separate field simulations didn't run yesterday (note to self: always wait for more than a few minutes to make sure code is up and running, and CHECK from home). But after some debugging, I successfully ran 2 of the 3 tests, for smoothing lengths of 15.0 and 30.0 arcmin. Since I was using a 5 x 5 field, I temporarily set it to a plain 2 x 2 field in order to debug more quickly. I would have used 1 x 1, but I was worried about some of the coding and wanted to make sure it would work for more than just 1 x 1 fields.

Created a new program for my new plots for these simulations, and set it up so that tomorrow morning I can (hopefully) run it before class. Having a good time figuring out how to plot them. Normally, it would be easier to plot, since there are fewer variables. But all of my other histograms shared lots of algorithm parameters, so before I could easily just copy my previously used code and input the new parameters; now, though I'm still copying code, I'm changing it more drastically than before. Getting to know better what each line of my code really means, so that's been interesting.

It's starting to dawn on me that there's only a week and a half of work left -- aka, I better get my act together, clean up my directories and programs, add lots of notes, and document this all in my latex document.

As for my 3 x 3 field simulation....well, there were some issues, but they've all been resolved. I have my beautiful cumulative histograms, ready to go!

Tuesday, July 20, 2010

Quick update before I forget everything:

Finally made cumulative histograms for this 3 x 3 field simulation. Took me a while, as I had some coding bugs that were trickier than I thought. So far, just have them for max values, but hopefully by the end of the day, I'll have the other two. Also starting to run some new field simulations for Andromeda XIX galaxies. Hopefully they will go smoothly, and tomorrow I'll have some more data to work with.


Also had group meeting (our second to last for the summer!), and heard about everyone's work. Things seem like they're going pretty well for everyone... so so cool to hear about Maya's work especially -- I'm really interested to see where it ends up in the next couple of weeks.

Thursday, July 15, 2010

New Field Simulations

The first one is the same field simulation with the added pixel sizes of .25 and .5, an increased field of 3x3 deg, and 10,000 fields. Tried running it overnight, but it took way too long. An edited version (no combinations of small pixel sizes and large scale lengths) will run tonight. I'll check on its progress tomorrow morning. Hopefully it will run smoothly (haha, get it?), and by Monday afternoon I'll have some cumulative histograms for it.

Next three simulations will have a pixel size of 2, a field size of 5x5 deg, nscale of 3, and a range of numbers of stars. Each simulation will have one smoothing length, and there will be 1000 trials per combination. And then we'll be inputting And XIX-like dwarfs!


Also, using the code from the Invisibles paper, I made a nice detection threshold plot. Not exactly sure if this is what Beth wants, but possibly.


Rafting tomorrow!

Tuesday, July 13, 2010

detection thresholds

Getting to the exciting part of this data structure and histograms: trying to redefine detection thresholds!


Earlier this morning, I plotted my max(S) values onto the image from The Invisibles paper (see below), to 1) verify that I'm getting values that are accurate, and 2) compare values and notice interesting trends.

Walsh, et al. (2009)

For the values closest to the ones cited in the paper (nscl =1, pixel size = 1, smoothing length = 4.0), for 1,000 stars and 10,000 stars, I got 6.08 and 5.62 respectively. For all my different nscales, keeping the rest of my parameters the same, I get values that range from 5.25-6.08. In the Invisibles paper, max(S) values lie within that range, so considering that we only ran 100 fields, it's looking pretty good. Changing nscale shifted the numbers more than expected, so that's something to look back over.

Also started changing my program. This time through, I expanded my field size to a 3 degree x 3 degree field, got rid of smoothing length = 6.0 and nscale = 1.0, and added pixel size = .25. This new pixel size should slow down computing, so if it takes anything more than a minute per trial, I think we'll end up nixing it. I tried the 3x3 field, but it's taking up too much memory right now, so I'm going with a 2x2 field for the time being. If it doesn't work tomorrow, Beth will help me possibly streamline my code so it uses less memory.

My time histogram overplots were really wonky. Note to self: always check images before coming to group meeting. Realized I had just quickly copy-and-pasted my titles and ranges, and edited the incorrect version. But I got them fixed and ready to go.

There's also a weird trend in some of my histograms. For nscale = 4.0, smoothing length = 6.0, the max(S) is really low. Hopefully this won't continue with a larger field size. If it does, it will be something to closely look into.




Things to do:

- Make cumulative histograms for more in-depth comparisons (can't use plothist --- need to use histogram)
- Run field simulation with 3x3 field size
- See how long trials take with 0.25 pixel size
- Update my latex document (it's been a while...)
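On the plothist-vs-histogram note: a cumulative histogram is just the binned counts plus a running sum. A small Python sketch of the idea (numpy standing in for IDL, with invented stand-in data; the real values are the simulation max values):

```python
import numpy as np

rng = np.random.default_rng(2)
max_vals = rng.normal(5.0, 0.8, size=1000)   # stand-in for max(S) values

# An ordinary histogram gives per-bin counts; the cumulative version is
# a running sum of those counts, normalized to a fraction of trials.
counts, edges = np.histogram(max_vals, bins=50)
cumulative = np.cumsum(counts) / counts.sum()

# cumulative[i] is the fraction of trials with max value below edges[i + 1],
# ready to overplot for each smoothing length.
print(edges[1:], cumulative)
```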


All in all, really glad to see that my calculations are looking similar to the ones in the Invisibles paper!

Friday, July 9, 2010

pretty plots = success!

As of today, I have 18 colorful histograms covering 89 different combinations of max values. Each figure has 5 overplots for the different smoothing lengths. The title of each graph is coded to read out the specific combination, there's a color-coded legend that shows the smoothing length for each curve, and the x-range is set by the minimum and maximum of each of the combinations.
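The bookkeeping behind those figures can be sketched in Python (the actual plotting was IDL, and the parameter values below are placeholders -- but the structure, 18 combination figures each overplotting 5 smoothing lengths, matches the plan):

```python
# Hypothetical parameter values; one figure per (nscale, pixsize, nstars)
# combination, with one overplot per smoothing length.
combos = [
    {"nscale": n, "pixsize": p, "nstars": s}
    for n in (2.0, 3.0, 4.0)
    for p in (0.5, 1.0, 2.0)
    for s in (1_000, 10_000)
]
smoothing_lengths = [2.0, 4.0, 6.0, 15.0, 30.0]

for combo in combos[:2]:   # first two of the 18 figures, as a demo
    # Coded title that reads out the specific combination.
    title = f"nscale={combo['nscale']}  pix={combo['pixsize']}  nstars={combo['nstars']}"
    labels = [f"sl = {sl}" for sl in smoothing_lengths]   # legend, one per overplot
    print(title, "|", ", ".join(labels))
```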

All in all a success!


For next week: finish editing graphs, and work on the table.

Thursday, July 8, 2010

pretty plots in the near future!

Sat down with Beth today and discussed the histogram plots I have so far. In order to better analyze them, my next task is to organize them in a way that lets us easily see the difference between the variables, say a smoothing length of 2 versus a smoothing length of 4. I'm going to combine the 90 plots I have at the moment and condense them into 18. Each of the graphs is going to be a specific variation of nscale, pixel size and number of stars, with 5 different overplots of the 5 different smoothing lengths.

I've had to tinker with my original data structure in order to do this, and have added a new structure which I will loop through to make the plots. The next thing I'll do is add in my actual plot statements with my overplots. For that, I'll also create titles that accurately describe the combination each one plots, with a general command.

Also fixed my calculation table: fixed the layout and added another calculation to it. Still have to add the columns that describe the combination for each calculated value. I'll do that with a for loop, I think? I was thinking that I'd be able to do it with a where statement and just have one column that lists the combination out... but that's for tomorrow.

I'm hoping that for tomorrow afternoon I'll have some pretty plots to show.

Wednesday, July 7, 2010

Almost accomplished my to-do list!

So I fixed my time plot issue. I decreased my bin size, and then made the intervals less frequent so the unit labels wouldn't overlap.

Also, created a table with my calculations for max, average, etc. with a for loop. For the for loop, I created a value that I hard-coded as the number of field variations we have. Still have that in there -- something for later today or tomorrow.

Started my And XIX analysis to see if/where it would be on the fundamental manifold. I have my mean surface brightness, but even in the Splash survey, there's no value for sig v for And XIX. I tried the calculation using 8.8 km/s, which is the value that Zaritsky uses for Ursa Minor and Draco, just to see if it would look reasonable. From what I've got so far, it looks like it too will be above the projection, just as Ursa Minor and Draco are, but much lower down on the x-axis on account of having a half-light radius of 1.7 kpc.

Tuesday, July 6, 2010

A lot to blog about:

Friday and today I worked on fixing all the for loops within my data structure. As of now, I have 5 for loops, in which I vary smoothing length, number of stars, nscale, and pixel size. That means I have a total of 89 variations of my fake sky fields, and for each variation, I have created 100 fields.

From this data, I've started to create plots and graphs of max, time, etc.
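For the curious, the variation count just falls out of the nested loops; in Python terms one would flatten them with itertools.product. The parameter grids below are illustrative (and this particular grid gives 120 variations rather than the 89 in my actual setup):

```python
from itertools import product

# Hypothetical parameter grids; the real code varies smoothing length,
# number of stars, nscale, and pixel size in nested IDL for loops.
smoothing_lengths = [2.0, 4.0, 6.0, 15.0, 30.0]
nstars = [1_000, 10_000]
nscales = [1.0, 2.0, 3.0, 4.0]
pixsizes = [0.5, 1.0, 2.0]

variations = list(product(smoothing_lengths, nstars, nscales, pixsizes))
n_fields = 100   # fields simulated per variation

print(len(variations), "variations ->", len(variations) * n_fields, "fields")
```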


As a side project, I started reading up on the "fundamental manifold," which is supposed to encompass the Local Group dwarfs in a single function. But this single function is a little wonky, and Ursa Minor and Draco don't align so neatly. So we'll see how useful this ends up being. Tomorrow, I'll plot And XIX on the graph from the Zaritsky (2008) paper and see if it lies within the parameters of the function.




Things to do:

-fix time in plots --- my units of time are too large on my plots to show anything, so I need to shrink that down

-de-hardwire ref in multiple plots for loop --- so to create my plots, I made a for loop that I called ref, and inserted the number of variations I had - need to make it more general

-create table of calculations from structure -- re-run calculations I did for one variation and save into a table

-add and xix to manifold

-update latex doc

Thursday, July 1, 2010

for loops

Quick blog post to re-cap what I've done today:

Worked on using for loops with the data structure I've created over the past couple of days, to be able to change pixel size, smoothing length, nscale, etc. for the same field N times. As of now, with help from Gail, I have 2 for loops that do this, but only for different pixel sizes. I've tried a couple of ways of adding another for loop, but it takes up so much memory on the server that I know there's a better way out there. The books/online help articles say that for loops are really slow and that there are better ways of doing this... so I'm toying with that idea, but at the same time, none of the examples I've seen so far use for loops in the same context that I am. I'm hoping that my quick registration break at Nova will give me some time to think about it, and when I come back I'll be able to figure it out.
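For reference, here's the trade-off in Python/numpy terms rather than IDL, with a toy "field" of uniform draws: the loop version touches one trial at a time, while the vectorized version builds all trials in a single array -- faster, but it holds everything in memory at once, which is exactly the tension above.

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials, n_stars = 100, 1_000

# Loop version: one random field per iteration (closest to the IDL loops).
maxima_loop = np.empty(n_trials)
for i in range(n_trials):
    x = rng.uniform(-0.5, 0.5, n_stars)
    maxima_loop[i] = x.max()

# Vectorized version: all trials drawn in one call, reduced along one axis.
x_all = rng.uniform(-0.5, 0.5, size=(n_trials, n_stars))
maxima_vec = x_all.max(axis=1)

# Same statistics either way; only speed and memory use differ.
print(maxima_loop.mean(), maxima_vec.mean())
```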

Wednesday, June 30, 2010

stellar population and data structure both completed!

What started out as a sort of slow day, ended up turning out really great.


Two turning points:
1) I finished my stellar population! All 216 files are neatly saved to a directory and ready for use!
2) Fixed my data structure! Turned it into a fits file and started plotting.


So I was feeling kind of off in the morning. I was slowly working on this data structure, didn't really understand my coding, and kept running into different errors. But with Beth's help, I got it up and running and started tinkering with the data. I made histograms for max, min, and time, as well as a couple of different error plots, which are neatly stored in a file. Learned the handy !p.multi trick and used it to neatly format my graphs. Beth asked for a histogram of minimum, but all of the minimum values are at 0 (this is with 10,000 stars), so the graph was uninformative (and unplottable, for that matter). I calculated the maximum of our max values, and did the same for min, average, and the average of the time it took for our calculation.

A few things to check:
- time error plot (I think the values should be smaller)
- median vs. average plot

also have to update my latex doc with the work of the past week.

Feeling good!

Tuesday, June 29, 2010

So much progress since this morning.

With lots of help from Beth, I fixed my function and got rid of my hard coding. Fixed my smoothing filter, and toyed around with different smoothing lengths, different numbers of exp scale lengths, pixel sizes, etc. It is so cool! Especially being able to look at the spatial smoothing section of the Walsh (2008) paper and understand what they're talking about. Really interested in the comparison between low/high numbers of exp scale lengths and smoothing lengths. Interested to compare these along with other variables... which brings me to my next task: setting up a data structure that would compute these fields N times and compute values for the max, min, average, etc.
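A rough numpy translation of the smoothing step (everything here -- field size, kernel cutoff, the FFT convolution -- is my own sketch, not the IDL code): bin the stars into pixels, convolve with an exponential kernel, and normalize to get a significance-like S map.

```python
import numpy as np

rng = np.random.default_rng(4)

# Bin a random star field into pixels (hypothetical sizes).
npix, n_stars = 64, 5_000
x = rng.uniform(0, npix, n_stars)
y = rng.uniform(0, npix, n_stars)
field, _, _ = np.histogram2d(x, y, bins=npix, range=[[0, npix], [0, npix]])

def exp_kernel(smoothing_length, half_width=10):
    """Normalized exponential kernel exp(-r / smoothing_length)."""
    g = np.arange(-half_width, half_width + 1)
    r = np.hypot(*np.meshgrid(g, g))
    k = np.exp(-r / smoothing_length)
    return k / k.sum()

def smooth(field, kernel):
    """FFT-based convolution (wraps at the edges; fine for a quick sketch)."""
    kpad = np.zeros_like(field)
    hw = kernel.shape[0] // 2
    kpad[:kernel.shape[0], :kernel.shape[1]] = kernel
    kpad = np.roll(kpad, (-hw, -hw), axis=(0, 1))   # center the kernel at (0, 0)
    return np.real(np.fft.ifft2(np.fft.fft2(field) * np.fft.fft2(kpad)))

smoothed = smooth(field, exp_kernel(smoothing_length=4.0))
S = (smoothed - smoothed.mean()) / smoothed.std()   # significance-like map
print("max(S) =", S.max())
```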

So I just started setting up my empty structure. Going okay so far, but lots of debugging in the near future. Was having issues with what I thought was the fltarr command, so I set all of them to 0.0 to try to debug, but that's not turning out so well either. I'll get it tomorrow or later today though. Looking forward to having a large data set that will be helpful.



things to do:
- debug function
- update latex doc with Koposov info (already updated with Walsh info)
- finish stellar population

Monday, June 28, 2010

Simulating fields

Today and last Friday, I've been working on my next major task: creating and using a function that simulates random uniform fields. This function basically generates N random x and y coordinates that represent a random distribution of stars. Though it sounds simple, I've spent quite some time and patience on it. Last Friday and early this morning, I had problems just getting data points to show up on my plot -- I kept getting just a dark graph. Found the problem in the fact that I had neglected the difference between arcmin and degrees. Fixed that in the procedure, but then didn't fix the same thing in the function... but it worked after that! I originally wanted to make my function/procedure as general as possible, so that someone could put in inputs spanning a wide range and it would still work, but for now, it's pretty much set to data points spanning a -.5, .5 range.
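A minimal Python version of the field function (the real one is IDL), with the degrees-to-arcmin conversion made explicit since that's what bit me. The field size and units here are illustrative:

```python
import numpy as np

def random_field(n_stars, size_deg=1.0, seed=None):
    """Return n_stars uniform (x, y) positions in a size_deg x size_deg field,
    in arcminutes, centered on 0 (60 arcmin per degree -- the conversion
    that caused the dark-plot bug)."""
    rng = np.random.default_rng(seed)
    half = size_deg * 60.0 / 2.0          # degrees -> arcmin, then half-width
    x = rng.uniform(-half, half, n_stars)
    y = rng.uniform(-half, half, n_stars)
    return x, y

x, y = random_field(10_000, size_deg=1.0, seed=0)
print(x.min(), x.max())   # spans roughly -30 to 30 arcmin
```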

After plotting the function, I applied an exponential filter and smoothed the plot. I spent some time tinkering around with the different filter parameters, changing how my filter smooths the data, but didn't end up seeing very dramatic results. Going to look at it with Beth tomorrow morning and hopefully change a few things.

Spent some time finishing up my stellar population - 2/3 of the way done! Going to hopefully finish it by the time I leave work today. Also added a few more sections to my latex document so I don't forget what I've been working on (though this blog makes sure that doesn't happen!).

things to work on:
- finish updating latex document
- keep working on fake field function and improve filter
- finish stellar pop





-------
update since 1 hour ago:

I might cry:
I realized that all of the combinations with al = 0, I did for bess + 2mass instead of sdss, so I deleted all of them and redid them. Just finished redoing them all, and realized I used filter 1 instead of filter 4. Just deleted all of them. Back to having only 1/3 of my stellar population done. Something for tomorrow morning!

Thursday, June 24, 2010

Accomplished my previously set goals for today. My M31 functions work great. Tested them out with data from And I, And III, as well as And X, and they seem to be in tip-top shape. Also re-started my stellar population database. 86 down!

Next focus is looking at the angular sizes and exponential scale lengths used when finding dwarfs. We'll be looking at And XIX specifically, as it was much larger than the others. The question of the moment is: if the angular size were changed (especially if it were larger), would more dwarfs have been found? Hopefully this will become clearer to me after reading the Koposov (2008) paper.

update from yesterday - everything is functional!

Gotta love the wordplay. Fixed my fraction of light function. Now, you can input any absolute magnitude, and it will return a percentage of light visible in that range. Also created functions for my distance calculations, so you can input RA, DEC, and distance from MW, and you will get the distance from M31. Did the same for my error calculation.

Also went to a lecture on Neanderthals and genetic differences between them and modern humans. Neat stuff.


Things I'm going to work on today:

- checking random known distances to ensure I correctly coded my functions
- add header to all of my procedures and functions that will be shared
- work on updating our stellar population from Dotter (aiming to be done by the end of the week)
- Hess diagrams sometime in the future?

Tuesday, June 22, 2010

Feeling pretty accomplished after today. Yesterday, when Beth mentioned that I would probably have a Dotter function and plot by the end of the day, I was a little intimidated and slightly overwhelmed. But it all worked out! I fixed my read file from yesterday and read in the data from Dotter, converted absolute magnitude into luminosity, and then found the fraction of light contained within the magnitude limit by multiplying the luminosity by # of stars to get total brightness. I'm proud of my calculation, as well as my file and my programming. It feels really good to be able to do this kind of thing. It felt good being able to present something at the group meeting, though next week I'll be able to explain what I'm doing better. Maya had some great images -- so cool to see her creating the diagrams that we've seen in every paper we've read so far. I'm excited to eventually have CMDs created from our own database in the future.
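The core of that calculation, sketched in Python with a made-up handful of (magnitude, star-count) rows standing in for the Dotter table (M_sun = 4.83 assumed for the V band; the magnitude limit is also illustrative):

```python
import numpy as np

# Hypothetical isochrone rows: absolute magnitude and number of stars per bin.
abs_mag = np.array([-2.0, 0.0, 2.0, 4.0, 6.0, 8.0])
n_stars = np.array([1, 5, 20, 80, 300, 1000])

# Convert absolute magnitude to luminosity (solar units, M_sun = 4.83 in V).
lum = 10 ** (-0.4 * (abs_mag - 4.83))

# Total brightness = luminosity per star times number of stars, summed.
total = (lum * n_stars).sum()

mag_limit = 4.0     # only stars brighter than this are observed
visible = (lum * n_stars)[abs_mag <= mag_limit].sum()

print(f"fraction of light above the limit: {visible / total:.3f}")
```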


After a great group meeting, I moved onto creating a function and using the interpol command in order to interpolate data for the values that I don't have specific data points for. Slightly disappointed, as I had trouble with my R-input. I ended up just typing in a series of integers that spanned my x-range. I wanted to write a command that wasn't manual so it'd be easier to change later on -- something to fix tomorrow. Also had a few issues with the function part as well. Have to read up on that for tomorrow.
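For what it's worth, the non-manual grid I was after is straightforward in numpy terms (np.interp standing in for IDL's interpol; the tabulated values below are invented): build the output grid from the data's own range instead of typing it by hand.

```python
import numpy as np

# Hypothetical tabulated data (x must be increasing for np.interp).
mag = np.array([0.0, 1.5, 3.0, 5.5, 8.0])       # tabulated x values
frac = np.array([1.0, 0.95, 0.80, 0.45, 0.10])  # tabulated y values

# Non-manual grid: spans min(mag)..max(mag), easy to change later.
grid = np.linspace(mag.min(), mag.max(), 50)
interpolated = np.interp(grid, mag, frac)

print(interpolated[0], interpolated[-1])   # endpoints match the table
```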


To end the day on a good note, I did my first 2 Dotter isochrones/luminosity functions. 2 down, 106 to go! Tomorrow morning, I hope to have figured out this interpol business and have that function down.

Monday, June 21, 2010

distance calculation = done! On to bigger and better projects...

Finished with my distance calculation and error! Spent the morning checking my equations and verifying my distances. Also updated my LaTeX doc with my equations and notes.


Met up with Beth to discuss upcoming projects. My next task is to take Dartmouth stellar evolution data to model the fraction of visible light from stars in the dwarfs versus their magnitudes. Since then, I've started my first read file and am formatting the data into a table I can use in IDL to create a plot. Looking forward to having this plot by tomorrow afternoon if all goes well!



(reminder: put notes about parameter errors in LaTeX)

Friday, June 18, 2010

so close to finishing this error calc!

Worked on latex document, entered all the calculations I've done these past few weeks, described some of the tasks I've worked on, etc.

Also re-ran that data calculation Beth wanted....forgot about degrees conversion.

Then I took a look at my distance error calculation -- and realized how much work I had to do on it. Originally, I had assumed the main uncertainty was only coming from the distance to the MW. After talking it over with Maya, I realized I needed to include the errors from the RA and DEC values. So I assumed the largest errors, assigned those for the rest of the RA/DEC errors, and then re-ran the fitsread files. My error calculation now has 4 main steps: 1) finding the errors for the x, y, z coordinates, 2) combining the errors from the dwarf coordinates with the errors from the M31 coordinates, 3) combining all of the x, y, z errors into one error, and 4) propagating that error through the final square root. As of now, I have all of my formulas written up; I just have to re-program them. I'm confident about the accuracy of steps 2-4 of my uncertainty calculation, but I want to re-check my partial derivatives in Mathematica before I enter them into my .pro file. Until Monday!
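The four steps, sketched as quadrature error propagation in Python (all numbers hypothetical; the real partial derivatives come from the full distance formula and still need the Mathematica check):

```python
import numpy as np

def distance_error(dx, dy, dz, sig_dwarf, sig_m31, dist):
    """Error on dist = sqrt(dx^2 + dy^2 + dz^2); sig_* are (sx, sy, sz)
    1-sigma errors on each Cartesian coordinate."""
    # steps 1-2: combine dwarf and M31 errors per coordinate, in quadrature
    sig = np.hypot(np.asarray(sig_dwarf), np.asarray(sig_m31))
    # step 3: combine the x, y, z errors into one; the partial of dist
    # with respect to each coordinate difference is (coord / dist)
    var = ((np.array([dx, dy, dz]) / dist) ** 2 * sig ** 2).sum()
    # step 4: the square root in the distance formula is already folded
    # into those coord/dist partials; just take sqrt of the variance
    return np.sqrt(var)

dist = np.sqrt(100.0**2 + 50.0**2 + 25.0**2)
err = distance_error(dx=100.0, dy=50.0, dz=25.0,
                     sig_dwarf=(5.0, 5.0, 5.0), sig_m31=(3.0, 3.0, 3.0),
                     dist=dist)
print(f"{err:.2f} kpc")
```

With equal per-coordinate errors, the weights (coord/dist)^2 sum to one, so the result reduces to the single-coordinate combined error -- a handy sanity check.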

Thursday, June 17, 2010

Started getting into my latex document. Not quite sure what to put in it, but I made a section and talked about my luminosity checking function and included the equation. Just tinkering around with it, but it's been fun so far.

Ran the distance calculation in idl with the correct parameter values! It's looking great! Two errors (And IX and And XX) look kind of high, so that's something to work on. All of the distance values look reasonable, though, so I don't think it's a problem with the uncertainty equation.... (note: go over that error equation with Beth and see if it is correct)

Also did a mini calculation for Beth. She wanted our parameter values converted into galactic coordinates and then sorted....it was a pretty simple task and I was excited that I understood how to do it without any significant problems....then I realized that I messed up my SORT command and just sorted the b column. Oops! But with Gail's help I got it in the end.

At a point where I'm not sure how I'm going to spend the rest of the afternoon...there's always trying to find those last 7 CSB data points. I'm slightly worried about finding those. Not sure where they're hiding, but after extensive searching I still don't have them....

^^update. Found a value for Leo T (actually had it the entire time, but forgot to enter it). Pretty sure there aren't values for Pisces II and Segue III... but the Andromeda ones should have CSB values, especially because there was a Keck survey... hm.

Wednesday, June 16, 2010

success!

Yay! Finally debugged my distance formula earlier this afternoon. After a mad series of calculations (that ended up killing the batteries on Maya's calculator -- oops!), and after figuring out my radian/degree errors, it is correct as far as I can tell. Once Maya puts in the new values for the table and I re-run IDL, it will be finished!

Also started my first latex document. I'm psyched to have all of our work in one place, all formatted nicely, at the end of the summer.

A slight downside is that for And XII, XIII, XV and XVI, Leo T, Segue II and III, and Pisces II, I couldn't find central surface brightnesses. For And XII, there was a value in other units -- but when I converted it using their taken data, it was still off by 0.2, so I figured it'd be better to keep searching. Something to keep working on.

other things:
- go over errors for distances
- re-run idl with complete set of correct values (don't forget!)
- continue with new latex doc
- hess diagrams

Tuesday, June 15, 2010

It feels great to get back to the lab -- but not the best day of work so far.

I thought that I could get through the rest of the central surface brightness values pretty quickly. I went through the data to consolidate sources in order to make it more consistent. With that, and adding other values here and there, as well as documenting everything, it took about half of the day. But just 6 more CSB values needed...

Spent the rest of my time working on my distance equation...kept thinking that I had fixed it, but it still isn't looking right. I think I'll have it by tomorrow though.


Also heard a neat lecture from an ocean engineering prof who gave an impromptu talk this morning. Cool stuff.


things to do:
finish CSB table
fix equation
hess diagrams

Friday, June 4, 2010

End of week 2

Feeling really good about today. I spent the morning finding central surface brightness values for the galaxies -- I have about 70% of them. Added some error values to our parameter measurements along the way. Also fixed my M31 equation. But I'm most proud of my error calculation. Using equations from the error reading we did this week, I put together what I think is a correct uncertainty formula. I was kind of daunted by the error calculation, especially the computer programming aspect of it, but I think I've done it (and without many error messages!). I now have a nice table with the galaxies listed with their xyz coordinates, distances and distance errors. Going to spend the rest of the afternoon searching for more central surface brightness values!




For next week:
- Gather more central surface brightness values
- Correct where function for M31 distances in order to separate them from MW satellites
- Go over error formula with Beth
- Figure out why some MW distances are close to M31 (possibly a calculation error?)
- add a column to our table documenting how these values were found (eek...kind of forgot about this until just now...will get on that pronto)



Until Monday!

Thursday, June 3, 2010

M31 Distance Calculation

Today I spent the day working on a calculation to determine the distances from the Andromeda satellites to the center of the Andromeda galaxy. The morning consisted of a complicated method to get this distance. After working at this for quite some time, I realized I should just convert everything into Cartesian coordinates. After lunch, Beth helped me get on track with this. I spent the afternoon doing a few sample calculations, and then entering the functions into our FITS binary table. As of now, I have the values for the x, y, and z coordinates, along with the distance and distance error values (all converted into radians), in neat columns in a new file. Surprisingly, when the distance was calculated, some dwarfs other than the And satellites had values up to 1.9 (MW satellites should have a value of 0)... gotta take a closer look at that. Also...
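The Cartesian approach, sketched in Python (the real code lives in IDL; the M31 RA/DEC/distance below are approximate literature values, and the units are illustrative):

```python
import numpy as np

def to_cartesian(ra_deg, dec_deg, dist_kpc):
    """Convert RA/DEC (degrees) and distance (kpc) to x, y, z coordinates."""
    ra, dec = np.radians(ra_deg), np.radians(dec_deg)
    return dist_kpc * np.array([np.cos(dec) * np.cos(ra),
                                np.cos(dec) * np.sin(ra),
                                np.sin(dec)])

def dist_from_m31(ra_deg, dec_deg, dist_kpc,
                  m31=(10.685, 41.269, 780.0)):   # approximate M31 values
    """3D distance from a satellite to the center of M31."""
    return np.linalg.norm(to_cartesian(ra_deg, dec_deg, dist_kpc)
                          - to_cartesian(*m31))

# Sanity check: M31's own coordinates should give a distance of zero.
print(dist_from_m31(10.685, 41.269, 780.0))
```

A quick sanity check like the one at the end is a good way to catch the kind of surprise described above, where objects that shouldn't be near zero (or should be) come out wrong.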

Complications to work out:
- format when name of dwarf is included in table
- the distance error for the last two values (check and see if error is that large, or if it is a calculation flaw)
- where command (know that it is different from regular old code, but needs tweaking)

Hopefully by tomorrow, this table will be in good form. Maybe add it to big data table (or subset of it)...? We'll see. Next week, I hope to make a plot or two using these distances -- any blatant errors should pop up then.

Also for tomorrow:
- Look up central surface brightness values (turn to Mateo and Martin for majority of them)
- Finish up readings

Wednesday, June 2, 2010

Very productive day!

Started out working on the data table, and fixed the little reference confusions from yesterday. From there went on to work with the luminosity comparison I've been doing. I'm very happy to report that they're all in a neat little table. I also played around with the where command and sorted my data. After so many error messages, I was very proud of my data sorting and my final luminosity comparison table.

After lunch, I talked to Beth about the distance calculation I'm going to be working on. It's a calculation to help determine the distances for M31 and MW satellites so that we can analyze M31 distances from the center of the galaxy to the satellite itself. Though it seems slightly daunting, I'm not too worried at this point. I have a feeling that while it may be frustrating for the next few days, it will be very satisfying by the end. Also talked about binary tables with Beth and Maya.

Also, I'm proud to say that every galaxy we have listed in our data table has RA and DEC values. Yay! I kind of feel like I'm just constantly adding bits and pieces to this data table; it feels really great to say that all of the RA and DEC values are in the table and accounted for.



things to do:
go over luminosities and source for systematic error
work new calculation
readings (2 journal articles and ch. 3 in book)

Tuesday, June 1, 2010

Today I went through the Wolf et al. references, and from those articles, updated our list further. I know it doesn't sound like a lot, but it took the majority of the morning. Then I met up with Beth later and discussed the data table sources we're using, which ones we prefer, back-up sources for our data, etc.

From there, I went through our references and checked that each value was actually calculated in the paper we cite, to make sure we're citing the original source directly. Almost all of them were direct references - yay! But there's a complicated chain of references from Roychowdhury to Begum and then Begum to Karachentsev. Gotta figure that out. Along those lines, I also have to replace the Martin (2) reference with the Begum (2008) citation. (Also, check and see if the subscription to the Royal Astronomical Society journal expired.)

I liked the team meeting. It was nice to hear what everyone else was up to. Also just a nice way to reflect on the work and progress of the past week.

things for tomorrow, reference-wise:

- enter in data from Martin (2008) and Mateo (1998) -- with this, we'll have data for the majority of galaxies in our chart, even if they aren't the most recent
- replace Martin(2) ref with Begum (2008), and see if e and pos. angle are included in Begum article
- go over Karachentsev refs along with Begum string of refs

outside of citation stuff:
- go over luminosities, take the log, and print them in order to further compare calculated and gathered lums
- go over where command
- distance calculation with beth (excited for first sig calculation!)
- read articles Beth gave us
- read ch. 3 of book

Monday, May 31, 2010

pretty IDL plots

Things I did today:

- Formatted RA/DEC columns
- Fixed comparison tables between calculated and gathered values of luminosity (most of them look pretty accurate!)
- Made pretty IDL plots. With Maya's help, I successfully created a plot and changed the color of the points and axes, as well as the shapes of the points themselves. I know it's not much, but it feels really good to actually have done it without error messages flooding my screen.

I think I'm going to try making another plot and messing around with colors and stuff before I leave.

Things to do (for tomorrow and beyond):
- Take a closer look at those luminosity values
- Check the Wolf et al. references more closely and go through his refs to check values that came from sources he didn't cite
- Move onto making even prettier data plots!

Friday, May 28, 2010

End of my first week

Great past couple of days!

Yesterday I entered my newly found position data into the huge data table and added my notes, from details about the authors and their works to specific values to go back and verify later. I went through the references from Wolf et al., and with the exception of 2 authors, every author was a main author of at least one work cited in the article (yay). I'm planning on going back and checking the data that wasn't referenced in that article and adding more to the table.


Five of us also attended Beth's lecture at Drexel yesterday afternoon. Really great talk. It was very cool to see Beth in an environment outside of Haverford, and see her speak on the topic of our research this summer. It's pretty exciting to know that we're working on such an up-and-coming aspect of astronomy. I was really reminded of that all throughout Beth's lecture and when hearing the questions people asked her afterwards.


Today I focused on doing some reading and then spent lots of time working with IDL to compare gathered and calculated values of luminosity and magnitude, in order to verify the accuracy of the gathered data. I did this through a little IDL function. After very frustrating error messages, I finally got the values I wanted.....but then I attempted to organize these values in a neat, easy-to-read style, and even with Beth literally writing down the code, help from books and the internet, and advice from Maya and Oliver, I still have yet to get this data into two columns. Until Monday....
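For reference, the comparison itself boils down to converting an absolute magnitude into a luminosity and printing the two values side by side. Here's a Python sketch of what I'm going for (our actual work is in IDL, and the dwarf values below are made up, not from our table):

```python
M_SUN_V = 4.83  # absolute V-band magnitude of the Sun (assumed value)

def lum_from_absmag(M_v):
    """Luminosity in solar units from an absolute V magnitude."""
    return 10 ** (0.4 * (M_SUN_V - M_v))

# hypothetical (M_V, gathered luminosity) pairs
dwarfs = [(-8.8, 2.9e5), (-6.4, 3.1e4)]

# print calculated vs gathered values side by side in two neat columns
print(f"{'calc L':>12}  {'gathered L':>12}")
for M_v, L_gathered in dwarfs:
    print(f"{lum_from_absmag(M_v):12.3e}  {L_gathered:12.3e}")
```

If the gathered data is good, the two columns should agree to within the rounding in the source papers.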


Although I'm slightly frustrated with IDL at the moment, looking back, I'm extremely pleased with this past week and how far I've come since then. I really hope I still feel this excited about the work that we're doing as the summer continues.

Goals for next week:

-become awesome at IDL -- specifically with plotting graphs, and changing plot formats (hopefully working up to error lines...?)
-clean up new data table and add central surface brightness column
-figure out this column business with luminosity and absolute magnitude

(note to self: ask Beth about alternate names for Andromeda, and add Ursa Major from Zucker)


On a side note, I had a great talk with Maya and Oliver about physics and astro at Haverford. I was feeling kind of overwhelmed about the next few years, and they really made me feel better about it all. It's been really nice having Maya, Oliver and Gail around to help me out with programming stuff, and just listening to their astro/physics experiences so far.

Looking forward to the weekend!

Wednesday, May 26, 2010

I think the SAO/NASA ADS astronomy query form has become my new best friend...

Today I gathered information on galaxy properties such as RA, DEC, position angle and ellipticity, to add to our giant data table of dwarf galaxy information. Right now, we have about 40 or so galaxies in our table.

As of now, all but two galaxies have RA and DEC data, and about half of them have RA, DEC, position angle and ellipticity as well! Along with that, I've noted down each author and paper that this data came from, and inadvertently created a list of dwarf galaxies and their pseudonyms.

I started out just sifting through different references from the Wolf et al. paper. But this was really tedious and didn't get me much data. I ended up finding a few authors who had discovered a lot of galaxies, and searching through their papers, starting with the most cited ones. In order to fill in the gaps, I looked up when the most recent galaxies were discovered, and then narrowed my searches to that time period. I also got a lot of references through SIMBAD, which I would follow to journal articles and pull data from.

As of now, I have about 10 pages of notes that I have to write up on the computer. I figured that editing the data table would be slow work for me, so I should gather as much info as I could initially and then add it to the actual table. Hopefully my limited coding knowledge will get me through adding new columns and data without too much trouble.

Feeling really good about this. Who knows? Maybe this compiled data, once plotted and graphed, will show us something new about stealth galaxies.

Tuesday, May 25, 2010

color magnitude diagrams

Worked on graphing with IDL early on in the morning, using data from the ASTR206 lab. Not the most successful, but hey, figuring out the error messages is a good way to learn.

Spent time reading about apparent and absolute magnitudes before meeting up with Beth and Maya to discuss color magnitude diagrams. They are so neat! Especially when it came to the different ways to filter these diagrams. I feel like there is so much potential in discovering/learning about these faint dwarf galaxies when it comes down to the many ways to filter these diagrams, especially with new scans of the sky that will be available within the next 5 years or so. Once again, I realized just how fast the field of astronomy is moving. Spent the rest of the time reading the Kalirai journal article. Good stuff.

Monday, May 24, 2010

First Day of Work!

Today is my first day of work at Haverford College working with Beth Willman and the gang doing research on stealth galaxies!

The day started off with a meeting with Beth, Maya and me, where we discussed what types of research we'll be doing this summer, what goals we hope to achieve, etc., along with going over a background article we read.

(Here's the article: http://www.hindawi.com/journals/aa/2010/285454.html)

Over the course of the next few months, (in a nutshell) we are aiming to be completely up to date with current research on stealth galaxies, and create a model for where we think stealth galaxies are. More short-term goals include getting caught up on the literature, and getting familiar with IDL and Unix.

One of my personal goals for the summer is to gauge the extent of my interest in astronomy. Though I have only limited experience in astronomy, I like what I know so far. I feel like this is my summer of "real-world astronomy" -- doing research and computer programming. If I'm still really interested in astronomy after this summer, then I think it's something I could see myself doing in the future.

Today was a lot of learning and catching up all at once. After our initial meeting, I spent the majority of the time getting used to Unix and playing around with IDL. Surprisingly enough, I'm finding that the computer coding isn't too frustrating. It's pretty cool actually. Even just from one day of tinkering around with it, I feel like I've learned a lot. I can see how after a fair amount of practice, it's something I could get pretty comfortable with.

I went through the Unix and IDL tutorials along with part of a virtual lab from the ASTR206 course. I created different directories in my squid home and copied and moved them around. I spent a while figuring out how to apply the interactive rm feature, and finally got it after lots of erasing and re-copying of files. I practiced plotting random data and then real data from one of the labs.
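Roughly what that rm practice looked like (the directory and file names here are just made-up scratch names, and yes stands in for me typing y at each prompt):

```shell
# make a scratch directory to practice on, with a couple of throwaway files
mkdir -p practice
touch practice/a.txt practice/b.txt

# rm -i prompts before each deletion; "yes" feeds it a y for every prompt
yes | rm -i practice/*.txt

# rmdir only succeeds once the directory is empty
rmdir practice
```

The -i flag is the safety net: without it, rm deletes immediately and there's no re-copying your way out of a typo.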

I also spent some time reading and taking notes from the Ostlie astrophysics textbook, learning about apparent and absolute magnitudes along with luminosity ratios. All neat stuff.
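The two relations I took notes on, sketched in Python rather than notebook scribbles (these are the standard textbook formulas, but the function names are my own):

```python
import math

def distance_modulus(d_pc):
    """Apparent minus absolute magnitude: m - M = 5 log10(d / 10 pc)."""
    return 5 * math.log10(d_pc / 10.0)

def luminosity_ratio(M1, M2):
    """L1 / L2 for two absolute magnitudes -- a 5-magnitude difference
    is exactly a factor of 100 in luminosity."""
    return 10 ** (0.4 * (M2 - M1))
```

So a star at 10 pc has m = M by definition, and each factor of 10 in distance adds 5 magnitudes.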

All in all, a great first day! I'm very excited for this summer and what will come of our research.