Segfault on variable that is used identically to another.



Segfault on variable that is used identically to another.

Postby gametaku » Tue Mar 20, 2012 7:43 pm UTC

Using the MPICH2 implementation of MPI, with a GCC v4.11 compiler.

Language: C



I'm getting a segfault caused by the variable inputVec2, and I can't figure out why, since it's used identically to inputVec1.

Code: Select all

void  getData(const int size, const int rank,  int *vecLength, double **vec1, double **vec2, int *scale1, int *scale2)
{

   int inp[3];
   int l;
   double *inputVec1;
   double *inputVec2;

   if(rank == 0){
      printf("How  many elements for each proc?\n");
      scanf("%d", &l);

      inp[0] = l * size;
      printf("Length of vectors: : %d\n", inp[0]);

      fflush(stdout);

      if(vec1 == NULL || vec2 == NULL){
         printf("Error getting data");
         return;
      }

      inputVec2 = (double*) malloc(inp[0] * sizeof(double));
      inputVec1 = (double*) malloc(inp[0] * sizeof(double));

   

      if(inputVec1 == NULL || inputVec2 == NULL){
         printf("Error getting data");
         return;
      }

      printf("Enter vector 1 elements: \n");
      int i;
      for(i = 0; i < inp[0]; i++)
      {
         scanf("%lf", (inputVec1 + i));
      }

      for(i = 0; i < inp[0]; i++)
      {
         printf("%lf\n", *(inputVec1 + i));
      }

      fflush(stdout);
      printf("Enter vector 2 elements:\n");
      for(i = 0; i < inp[0]; i++)
      {
         scanf("%lf", (inputVec2 + i));
      }
      for(i = 0; i < inp[0]; i++)
      {
         printf("%lf\n", *(inputVec2 + i));
      }

      printf("Enter 2 scallers: \n");
      fflush(stdout);
      scanf("%d %d", &inp[1], &inp[2]);
   }
   MPI_Bcast(inp, 3, MPI_INT, 0, MPI_COMM_WORLD);   

   *vecLength = inp[0]/size;
   *scale1 = inp[1];
   *scale2 = inp[2];

   int secVecSize = *vecLength * (sizeof *vec1);
   *vec1 = malloc(secVecSize);
   *vec2 = malloc(secVecSize);

   

   MPI_Scatter((*vec1), *vecLength, MPI_DOUBLE, inputVec1, *vecLength, MPI_DOUBLE, 0, MPI_COMM_WORLD);

 // This is the line that causes the segfault
   MPI_Scatter((*vec2), *vecLength, MPI_DOUBLE, inputVec2, *vecLength, MPI_DOUBLE, 0, MPI_COMM_WORLD);


   if(rank == 0){
      free(inputVec1);
      free(inputVec2);
   }

}


Re: Segfault on variable that is used identically to another

Postby Yakk » Tue Mar 20, 2012 9:04 pm UTC

I didn't spot anything obvious.

The scale1/scale2 variables aren't used after being read from input, and the size variable doesn't seem to have an obvious meaning.

The obvious cause would be that you screwed up on the first vec, but because of how you allocated it there was room after it?

Glancing at MPI_Scatter, it appears you are using a multiprocessing framework? And that the processes that get the messages are running "the same code"?

I don't see where you guarantee that the processes have gotten the data in your buffer before you delete them. Is this guaranteed by the time the function returns?


Re: Segfault on variable that is used identically to another

Postby gametaku » Tue Mar 20, 2012 9:57 pm UTC

Yakk wrote:I didn't spot anything obvious.

The scale1/scale2 variables aren't used after being read from input, and the size variable doesn't seem to have an obvious meaning.

scale1/scale2 are output parameters.

size is the number of nodes on the cluster.

Yakk wrote:The obvious cause would be that you screwed up on the first vec, but because of how you allocated it there was room after it?

Hmm, I'll see if I can figure out something about that.


Yakk wrote:Glancing at MPI_Scatter, it appears you are using a multiprocessing framework? And that the processes that get the messages are running "the same code"?


Well, a cluster, and yes, they are all running the same code. Hence the use of if(rank == 0) to run certain code only on the 0th node.
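Roughly this pattern (just a sketch of the shape, not my actual code):

Code: Select all

/* The usual SPMD pattern: every node runs the same program, */
/* and rank 0 is singled out for input/output.               */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
   int rank;
   MPI_Init(&argc, &argv);
   MPI_Comm_rank(MPI_COMM_WORLD, &rank);
   if (rank == 0) {
      printf("Only the 0th node prints this.\n");
   }
   MPI_Finalize();
   return 0;
}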


Yakk wrote:I don't see where you guarantee that the processes have gotten the data in your buffer before you delete them. Is this guaranteed by the time the function returns?

I'm fairly sure it's safe to free (MPI_Scatter blocks, so the buffers are reusable once it returns), and regardless, I'm not reaching that point in the code before the segfault.


Re: Segfault on variable that is used identically to another

Postby Sc4Freak » Tue Mar 20, 2012 10:24 pm UTC

I think you messed up your call to MPI_Scatter. I've never used this library before, but it looks like some documentation is here:

http://mpi.deino.net/mpi_functions/MPI_Scatter.html
An alternative description is that the root sends a message with MPI_Send(sendbuf, sendcount * n, sendtype, ...).


So it looks like the number of elements in vec1/vec2 should be vecLength * size. Since vec1 and vec2 are too short, MPI_Scatter is reading off the end of the array and causing a segfault.
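If I'm reading that right, the sizes would have to look something like this (just a fragment to show the sizes, reusing your names; vecLength is elements per process and size is the process count):

Code: Select all

/* Root's send buffer needs vecLength * size doubles (inp[0] in your code); */
/* every rank's receive buffer needs only vecLength doubles.                */
double *sendbuf = malloc(*vecLength * size * sizeof *sendbuf); /* rank 0 only */
double *recvbuf = malloc(*vecLength * sizeof *recvbuf);        /* every rank  */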

But by the looks of it, you're sending garbage anyway. You never initialize vec1/vec2, and you write your results into your inputVec1 and inputVec2. I think you might have gotten those switched around.

But like I said, I've never used this library before, so I may be way off.


Re: Segfault on variable that is used identically to another

Postby Yakk » Wed Mar 21, 2012 3:37 am UTC

Laugh, ya:

Code: Select all

int MPI_Scatter(
  void *sendbuf,
  int sendcnt,
  MPI_Datatype sendtype,
  void *recvbuf,
  int recvcnt,
  MPI_Datatype recvtype,
  int root,
  MPI_Comm comm
);

sendbuf and sendcnt are args 1 and 2. So your inputVec should be arg 1, and sendcnt should be... l? Which is inp[0]/size (how many elements go to each target).

The receivers get their data in recvbuf. That is *vec1 and *vec2.

Code: Select all

MPI_Scatter(inputVec1, *vecLength, MPI_DOUBLE, (*vec1), *vecLength, MPI_DOUBLE, 0, MPI_COMM_WORLD);
MPI_Scatter(inputVec2, *vecLength, MPI_DOUBLE, (*vec2), *vecLength, MPI_DOUBLE, 0, MPI_COMM_WORLD);

which should do what you want?

One thing I find awkward is forcing whoever does the input to know how many processes there are. I guess solving that is annoying, and you are doing this for some narrow special case, but I'd want to do it.
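If I were doing it, something like this untested sketch would let whoever types the input give a single total, since MPI_Scatterv takes per-rank counts and the total doesn't have to divide evenly (getDataV, myCount, and myChunk are made-up names):

Code: Select all

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

/* Untested sketch: rank 0 reads one total element count, and */
/* MPI_Scatterv hands each rank its share, with the remainder */
/* going to the low ranks.                                    */
void getDataV(const int size, const int rank, int *myCount, double **myChunk)
{
   int total = 0;
   if (rank == 0) {
      printf("How many elements in total?\n");
      scanf("%d", &total);
   }
   MPI_Bcast(&total, 1, MPI_INT, 0, MPI_COMM_WORLD);

   /* per-rank counts and displacements into the root's send buffer */
   int *counts = malloc(size * sizeof *counts);
   int *displs = malloc(size * sizeof *displs);
   int r, offset = 0;
   for (r = 0; r < size; r++) {
      counts[r] = total / size + (r < total % size ? 1 : 0);
      displs[r] = offset;
      offset += counts[r];
   }

   /* only the root holds the full input vector */
   double *sendbuf = NULL;
   if (rank == 0) {
      int i;
      sendbuf = malloc(total * sizeof *sendbuf);
      for (i = 0; i < total; i++)
         scanf("%lf", &sendbuf[i]);
   }

   *myCount = counts[rank];
   *myChunk = malloc(*myCount * sizeof **myChunk);
   MPI_Scatterv(sendbuf, counts, displs, MPI_DOUBLE,
                *myChunk, *myCount, MPI_DOUBLE, 0, MPI_COMM_WORLD);

   if (rank == 0) free(sendbuf);
   free(counts);
   free(displs);
}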

