A Conversation with V.P. Godambe

Friday, March 12, 2010

V.P. Godambe, SSC 1987 Gold Medalist

Brajendra Sutradhar and Vidyadhar Godambe. Photo by L. Cowen

Vidyadhar P. Godambe was born on June 1, 1926, in Poona, a city near Bombay, India. He received the M.Sc. degree in Statistics from Bombay University in 1950 and the Ph.D. from the University of London in 1958. From 1951 to 1955, he was a Research Officer in the Bureau of Economics and Statistics of the Government of Bombay. Following a year as Visiting Lecturer at the University of California at Berkeley (1957-8) and a year as Senior Research Fellow at the Indian Statistical Institute in Calcutta (1958-9), he became Professor and Head of the Statistics Department at Science College in Nagpur. He was promoted to the position of Professor and Head of the Statistics Department in the Institute of Science, Bombay University, in 1962. In 1964, he left India for North America, becoming for one year a Research Statistician at the Dominion Bureau of Statistics in Ottawa. After subsequent Visiting Professorships at Johns Hopkins University and the University of Michigan, he joined the University of Waterloo Department of Statistics in 1967, and settled there.
 

Professor Godambe is a Fellow of the American Statistical Association, the Institute of Mathematical Statistics, and the Royal Statistical Society, and a Member of the International Statistical Institute. He is the recipient of the 1987 Gold Medal of the Statistical Society of Canada.
 

The following conversation was recorded 1 May 1988, at the University of Waterloo, and originally appeared in Liaison Vol. 2, No. 3, Spring-Summer 1988.
 

L = Liaison; G = Godambe
 

L. You didn’t start out to be a statistician.
 

G. That is true, but I cannot say specifically what I started to be, because there were many interests. One was painting, then on a more academic side Sanskrit, philosophy, theoretical physics, and mathematics. It was rather clear that I would go for an academic profession, though the rarity of the job opportunities made me take, briefly in my life, routes which did look non-academic.
 

L. Then how did you become interested in statistics?
 

G. I told you the job opportunities were very rare in India in those days. But there was great demand for statisticians primarily because of the Five Year Plans and because Mahalanobis, then Director of the Indian Statistical Institute, gave a big impetus to the use of statistics in conducting surveys. To make two ends meet, namely my interest in theoretical subjects, and my interest in securing a job which would give a good livelihood, I thought maybe I would go for statistics.
 

After my Master’s, though there were jobs for statisticians, even those opportunities were far fewer than they are now, and most of my colleagues joined some local colleges as teachers. Ordinarily, I also would have taken a job in a local college, but fortunately—I must say fortunately here—I fell ill with some unidentified illness for about six months. When I was out of bed and looking for jobs, all the colleges around had already filled their vacancies, so I applied for a job in the Bureau of Economics and Statistics in Bombay, and to my surprise I was hired at a salary double what I applied for. I had practically no routine work defined for me, so I could study sampling and see what I could do with the problems that the Bureau was facing.
 

L. Was that when you first started to think seriously about sampling?
 

G. Yes. I had time, and the Director of the Bureau, Mr. Sankpal, encouraged me a lot. In fact, he wanted to promote me to some high position. Unfortunately, because I was so junior, and in government seniority counts, I just had to be content with the position I had. Again, after Mr. Sankpal took up a UN job in Rome, he tried to take me there but did not succeed because I was junior. With these failures I felt very bad at that time. Looking back, I think it was a good thing. I decided to leave the Bureau, to pursue my interest in the foundations of statistics and to get my Ph.D. degree.
 

L. What made you think of going to Imperial College for your Ph.D.?
 

G. George Barnard was there. I had read some of his comments and papers, and we also had a common acquaintance who was my teacher, Dr. K.S. Rao. He had then just visited George Barnard at Imperial College and he helped me to get in touch with him. After initial discussions with Barnard, I decided to set sampling aside to think about other problems, and that is what I did for the next three years.
 

I actually wrote the thesis in Berkeley. I was halfway through the Ph.D. when Barnard wanted to go to India for two or three months. I also had an offer to go to Berkeley to teach. That would help me solve some of my economic problems. So I wrote the thesis in Berkeley and then, on the way back to India, I submitted it to Imperial College.
 

With Barnard, every week I would discuss statistical problems, basic fundamental statistical problems, and I was so much interested I thought I would never write my thesis. Going to Berkeley was like going to another planet. The atmosphere was so different because the legitimacy of statistical inference as a subject was not accepted there at that time. People there generally accepted Neyman’s theory of inductive behaviour, which would in effect deny this legitimacy. Apart from differences of opinion, I thought the intellectual atmosphere was more regimented and less free. This of course is not denying the great scholarship in specialized areas.
 

L. Following Berkeley, you went back to India, to the Indian Statistical Institute. Who was there at that time?
 

G. Professor J.B.S. Haldane was there for the whole year. It was a great privilege to be with the great man. I cannot say directly how he influenced me, but the atmosphere he created by his very existence was so different: full of life and challenges to the traditional, unthinking ways of life. A person with whom I came in contact often was C.R. Rao. And at that time, Bahadur was there, Basu was there. The atmosphere generally was also stimulating.
 

One of the lighter moments I still remember: once, in the coffee room, Bahadur was grudgingly telling me that he could not sleep well and I was telling him possible ways of restoring his sleep. Haldane intervened from a distant corner—we were unaware that he was listening to our conversation—“Bahadur, do you want to sleep well? Then attend my lecture!” Actually, it was impossible to sleep during Haldane’s lectures for he was a very energetic speaker.
 

L. And then you went to Nagpur, as Professor and Head of the Statistics Department.
 

G. That’s right. Very comfortable job. A big salary for those times, the slowly moving life of Nagpur, and I felt thoroughly comfortable there. And at that time, the estimating equations paper (1) was published. I must tell you I was thinking of that subject right from the day I entered statistics. I think these ideas were also available to some physicists: that we should assume that the event of largest probability has happened; this itself implies inference about whatever unknown there is. I thought this was the ultimate principle of statistical inference, which could not be further reduced to anything simpler. This principle, unlike the likelihood principle, is entirely sample space and distribution based. In my Ph.D. thesis, I have a discussion of modal inference, where I tried to develop this. But then soon I realized that the mathematics associated with the mode was too complicated and the one associated with expected values was far simpler. The actual optimality criterion for unbiased estimating equations I got when I was working in the Indian Statistical Institute in Calcutta.
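[An illustrative aside, not part of the conversation: the 1960 paper showed that among unbiased estimating functions the score function is optimal, and estimation proceeds by solving the estimating equation g(θ; x) = 0 rather than by maximizing anything. A minimal Python sketch, with the Poisson model and all data invented for illustration:]

```python
# Sketch: solving an unbiased estimating equation g(theta; x) = 0.
# For a Poisson mean theta, the score g(theta; x) = sum(x_i)/theta - n
# is unbiased (E[g] = 0 at the true theta), and by Godambe's 1960
# optimality criterion the score is the optimal unbiased estimating
# function.  Here we solve g = 0 numerically by bisection.

def score(theta, xs):
    """Score-type estimating function for a Poisson mean theta."""
    return sum(xs) / theta - len(xs)

def solve_bisect(g, lo, hi, tol=1e-10):
    """Root of a decreasing function g on [lo, hi] by bisection."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if g(mid) > 0:          # root lies above mid
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

xs = [2, 3, 1, 4, 0, 2, 3]      # invented data
theta_hat = solve_bisect(lambda t: score(t, xs), 1e-6, 100.0)
print(theta_hat)                # the root is the sample mean, 15/7
```

[The root coincides with the maximum likelihood estimate here; the point of the estimating function formulation is that unbiasedness and the optimality criterion apply even when no full likelihood is written down.]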
 

L. Why did you leave the comfortable life of Nagpur?
 

G. I did not leave that comfortable life. I got a promotion and I went to Bombay as Professor for one year.
 

L. And that is where you came in contact with V.M. Joshi.
 

G. That’s right. Joshi then was the Secretary of the whole Education Department, and I was Professor of Statistics in the Institute of Science, which belonged to the Education Department. He had obtained a Tripos degree several years before, and he wanted to do a Ph.D. in statistics. He had heard of me and I had heard of him, because Joshi had set a record in the B.Sc. examinations of the University.
 

L. Then of course came the big break, going to North America again.
 

G. That also was somewhat accidental. Though I was transferred to Bombay and the Professorship was created for me, other things required for starting a statistics department were not there. I said why don’t I go away for some time, and the government allowed me leave, but then I stayed in North America. That’s how things are. I went first to the Dominion Bureau of Statistics with the help of Ivan Fellegi, then Johns Hopkins University, the University of Michigan, and ultimately to Waterloo.
 

L. What was the period at the Dominion Bureau of Statistics like?
 

G. That was good for me. Somehow, just as I wrote the 1955 paper (2) on the foundations of sampling in the Bureau of Economics and Statistics, so the 1966 paper (3) (which I consider important), showing that the likelihood function in sample surveys is independent of the design, was thought out and written when I was in the Dominion Bureau. Now whether it was coincidence or the atmosphere I don’t know.
 

L. Tell us about the Chapel Hill Symposium of 1968.
 

G. Actually, before 1966 I gave talks on the foundations of survey sampling in several universities, and though my 1955 result (that a uniformly minimum variance unbiased estimate does not exist; in particular, the sample mean is not UMV in the survey sampling setup) was known to sample survey statisticians, it was not known to the general theoretical statistical public. This result really was received by the general statisticians of the United States with great surprise, and that led to the Symposium.
 

It was ideal from my point of view, because the North Carolina people found money, people to organize it, and they invited all the people I wanted them to invite. I just went there as a guest, though the Symposium was organized at my suggestion. And that was really an important step in the development of the subject, because of the participation of so many statisticians who otherwise were not aware of what went on in sampling.
 

L. This must have led to the 1970 Symposium on the Foundations of Inference at the University of Waterloo.
 

G. Yes, the earlier symposium on the foundations of sampling was such a success and so we thought of one on inference, and David Sprott of Waterloo was extremely enthusiastic about it. He supported the idea with everything, money and whatever was required. I think to this day there hasn’t been a comparable event in the history of statistics at all. Neyman was the opening speaker, and Bartlett was the banquet speaker. I tried to invite Jimmy Savage but he could not come. I would have liked him to come and that would have added to the occasion. I also tried to get Allan Birnbaum.
 

L. You have been much influenced by Birnbaum’s study of the principles of inference.
 

G. Yes. He demonstrated for the first time that such study can be meaningful, if not fruitful, because it led to a conclusion, whether acceptable or not. I have said it that way in my obituary of Birnbaum. He created a new area of research in statistics.
 

L. After the symposium, you went on leave to England.
 

G. I was in Sheffield. Joe Gani invited me. And during that time, in 1971, the paper on Bayes, fiducial and frequency aspects of inference in sampling was read, in collaboration with Mary Thompson, at the Royal Statistical Society (4). I think it is a very clearly written paper, and many people who studied it carefully liked it. It put fiducial probability in a sampling framework for the first time. It also discussed other concepts, Bayesian and non-Bayesian, in the sampling framework, and the extent to which they could be reconciled. Again, this paper was interesting because the people who commented on it were not conventional survey statisticians, but people who did inference.
 

Following in the same spirit, we presented the paper on robust near-optimal estimation in surveys at the New Delhi meetings of the International Statistical Institute in 1977 (5), and this led to more work on the likelihood principle and its relationship to randomization. Then the notion of robustness coming from randomization was further developed in my robustness and optimality paper which appeared in JASA in 1982 (6). This paper does not see robustness as in conflict with efficiency. It just says that you establish efficiency under more flexible conditions, and that is through appropriate randomization.
 

L. Of course these considerations are closely related to “Godambe’s paradox”.
 

G. Yes. When the analysis of the role of randomization was carried to its logical conclusion, I could see that there was a paradox involved. Apparently, using the same arguments in a very elementary situation, one is making inference about an unknown constant exclusively on the basis of the realized value of a random variate and its completely specified distribution. That is, mathematically the distribution is independent of the unknown constant. That was discussed in The Canadian Journal, and in JASA, JRSS and elsewhere (7, 8, 9). And I would like people to discuss it more because I myself do not clearly see the solution. I often think that just as Russell’s paradox ultimately was resolved in terms of analysis of language, similarly here, by properly restricting the definition of parameter, we could eliminate this one. But this is just a very obscure kind of feeling I have.
 

To come back to the context of sampling theory, as I mentioned before, the paper I published in JRSS 1966 demonstrated that the likelihood principle implies that inference should be independent of the sampling design in general. This led to the development of model theory in survey sampling. The proponents of this theory, Royall (10) and subsequently others, rejected all use of randomization frequencies at the inference stage. For them, the inference must follow strictly from the superpopulation model. I have no sympathy for this view. I firmly believe that all nontrivial inference would require both model probabilities and randomization frequencies. Looking at the development of model theory today, it looks as though its proponents use all sorts of excuses for using the sampling design, and still somehow in a religious way maintain their model theory. But all this has done some good; it has helped people understand the role of randomization in survey sampling better. Randomization has survived this attack and has emerged with new strength and new meaning.
 

L. What made you finally settle in Canada?
 

G. It looked like a combination of good things in England and good things in the United States. That is, there was economic flexibility and also physical flexibility, in the sense that Canada is a vast stretch of land compared to Europe, and you don’t feel crowded as you do in Bombay or Europe. Really, I do not think I would have been as much at home elsewhere.
 

L. And the mechanics of living in Waterloo are so easy, except for...
 

G. Winter. That I could not get used to, even after twenty years. I think it does some good to health, although we dislike it so much.
 

L. ... and the swimming pool’s irregular hours!
 

G. Yes, that has been my addiction for the last ten years. I swim almost every day. Most important, in Waterloo I found people who were congenial in temperament. Particularly, my research temperament could be appreciated, or understood at least, by some of my colleagues. And I found a great collaborator in work...
 

L. Let us come back to estimating equations and estimating functions. You started thinking again about the nuisance parameter situation in 1973.
 

G. Yes, stimulated by some lectures George Barnard was giving at the University of Waterloo, Mary Thompson and I obtained our first result for optimality in the presence of a nuisance parameter in 1974 (11).
 

L. Then came the 1976 Biometrika paper (12) which showed optimality of conditioning on a statistic which is complete and sufficient for the nuisance parameter.
 

G. Yes, that is a clear result, how conditioning can be incorporated within the setup of estimating functions. It suggested that this simple technique can go a long way. And it suggested new approaches to the concepts of sufficiency and information in the presence of a nuisance parameter.
 

Meantime, we were also thinking about estimating functions in the context of nonparametric estimation of a mean and extensions of this, and I think our 1978 JSPI paper (13) for the first time showed the superiority of the estimating function approach in relation to both maximum likelihood and UMV estimation.
 

L. Then more recently came the result on finite sample estimation in stochastic processes. How did this come about?
 

G. I think again it was the atmosphere. Mary Thompson had arranged some informal seminars on stochastic processes, and I used to sit there, often. Also I was reading a few things, on martingales and the conditional least squares method. I tried to put those things into the setup of estimating function theory, and things looked quite fruitful (14). Later Joe Gani told me that the formula I proposed had many applications. He showed me David Kendall’s work of long ago.
 

L. Then more recently you turned your attention to estimating functions in sampling.
 

G. Yes, in collaboration with Mary Thompson. Our main paper on this appeared in the International Statistical Review 1986 (15). I like this paper so much; perhaps it is our most constructive paper.
 

L. It has many implications.
 

G. Many implications and applications. Yes. Sooner or later, people will use the results and it will actually influence the thinking of sample survey statisticians. It will take some time, because it will need adjustment in the thinking of practitioners and theorists alike, and the implications for interval estimation are still to be realized.
 

L. What are you thinking about these days?
 

G. Quasi-likelihood and related things are foremost in my mind these days. That is another area where estimating function theory has helped organize the material in a systematic way and further it.
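[An illustrative aside, not part of the conversation: quasi-likelihood estimation, which Godambe mentions here, requires only a mean function and a variance function rather than a full likelihood, and the estimate solves the quasi-score estimating equation. A minimal Python sketch, with the model μᵢ = exp(βxᵢ), variance function V(μ) = μ, and all data invented for illustration:]

```python
import math

# Sketch: the quasi-score estimating function
#   g(beta) = sum_i (d mu_i / d beta) * (y_i - mu_i) / V(mu_i).
# With mu_i = exp(beta * x_i) and V(mu) = mu this reduces to
#   g(beta) = sum_i x_i * (y_i - exp(beta * x_i)),
# and the estimate beta_hat solves g(beta) = 0.

def quasi_score(beta, xs, ys):
    """Quasi-score for mean exp(beta*x) with variance function V(mu)=mu."""
    return sum(x * (y - math.exp(beta * x)) for x, y in zip(xs, ys))

def newton(g, beta, h=1e-6, steps=50):
    """Solve g(beta) = 0 by Newton's method with a numerical derivative."""
    for _ in range(steps):
        slope = (g(beta + h) - g(beta - h)) / (2 * h)
        beta -= g(beta) / slope
    return beta

xs = [0.0, 1.0, 2.0, 3.0]       # invented covariate values
ys = [1.0, 3.0, 7.0, 21.0]      # invented responses
beta_hat = newton(lambda b: quasi_score(b, xs, ys), 1.0)
print(beta_hat)
```

[No distribution was assumed for the ys; the same equation arises as the Poisson score, which is one way estimating function theory organizes quasi-likelihood "in a systematic way".]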
 

L. It is interesting that by now the term estimating function has become almost a household word in the statistical community.
 

G. It is most satisfying to see the developments in this area at Waterloo. Some of my colleagues have found varied applications of the estimating function methodology, and others have given new interpretation providing a different perspective on the subject (16). I do not know where else I could have got more responsive colleagues.
 

L. Thank you, Professor Godambe.
 

References

  1. Godambe, V.P.; Ann. Math. Statist., 31 (1960), 1208-1212.
  2. Godambe, V.P.; J. Roy. Statist. Soc., Series B, 17 (1955), 269-278.
  3. Godambe, V.P.; J. Roy. Statist. Soc., Series B, 28 (1966), 310-328.
  4. Godambe, V.P. & M.E. Thompson; J. Roy. Statist. Soc., Series B, 33 (1971), 361-390.
  5. Godambe, V.P. & M.E. Thompson; Bull. Int. Statist. Inst., 47 (1977), 129-146.
  6. Godambe, V.P.; J. Amer. Statist. Assoc., 77 (1982), 393-406.
  7. Godambe, V.P.; J. Amer. Statist. Assoc., 77 (1982), 931-933.
  8. Genest, C. & M.J. Schervish; Canad. J. Statist., 13 (1985), 293-301.
  9. Bhave, S.V.; Statist. Prob. Letters, 5 (1987), 243-246.
  10. Royall, R.M.; Amer. J. Epidemiology, 104 (1976), 463-474.
  11. Godambe, V.P. & M.E. Thompson; Ann. Statist., 2 (1974), 568-571.
  12. Godambe, V.P.; Biometrika, 63 (1976), 277-284.
  13. Godambe, V.P. & M.E. Thompson; J. Statist. Plann. Inf., 2 (1978), 95-104.
  14. Godambe, V.P.; Biometrika, 72 (1985), 419-428.
  15. Godambe, V.P. & M.E. Thompson; Int. Statist. Rev., 54 (1986), 127-138.
  16. McLeish, D.L. & C.G. Small; Springer-Verlag Lecture Notes in Statistics No. 44, 1988.