Human Generated Data

Title

Untitled (man with fishing pole surrounded by children)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4468

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.8
Human 99.8
Person 99.7
Person 99.5
Person 99.5
Person 99.2
Clothing 92.7
Apparel 92.7
Shoe 90.8
Footwear 90.8
Shoe 90.5
Face 87.8
People 82.3
Person 74.1
Person 71.7
Kid 67
Child 67
Sailor Suit 64.1
Text 63.9
Girl 62.4
Female 62.4
Photography 61.7
Photo 61.7
Shoe 59.4
Shorts 58.6
Advertisement 56.5
Shoe 52.4
Shoe 51.5
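
These labels are the kind of output returned by Amazon Rekognition's DetectLabels operation: a label name plus a confidence score, which is what each pair above records. The exact pipeline behind this record isn't documented, but a minimal sketch with boto3, assuming AWS credentials are configured and photo.jpg is a placeholder for the image file:

```python
import boto3

# Rekognition returns labels with confidence scores, matching the
# "Label  Confidence" pairs listed above.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder filename
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # drop labels the model scores below 50%
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```

Repeated rows such as the several "Person" entries likely correspond to separate detected instances (bounding boxes) of the same label, which Rekognition reports under each label's Instances field.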

Imagga
created on 2022-01-23

brass 47.3
musical instrument 39.5
wind instrument 39.5
cornet 33.4
man 24.8
person 21.5
violin 19.6
male 19.1
people 17.3
adult 15.8
bowed stringed instrument 14.9
sport 14.8
stringed instrument 13.8
active 13.5
art 12.6
statue 12.5
bugle 11.9
player 11.9
sculpture 11.9
exercise 11.8
portrait 11.6
men 11.2
marble 11.1
architecture 10.9
business 10.9
lifestyle 10.8
businessman 10.6
human 10.5
body 10.4
action 10.2
face 9.9
fitness 9.9
activity 9.8
fun 9.7
dress 9
health 9
new 8.9
happy 8.8
drawing 8.7
happiness 8.6
black 8.4
old 8.4
hand 8.3
color 8.3
traditional 8.3
outdoors 8.2
professional 8.1
game 8
life 7.9
couple 7.8
play 7.7
wall 7.7
attractive 7.7
sky 7.6
outdoor 7.6
two 7.6
fashion 7.5
lady 7.3
smiling 7.2
history 7.2
summer 7.1
trombone 7.1
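
The instrument-heavy tags here are raw tagger output; the fishing pole evidently reads to the model as a brass or stringed instrument. Imagga exposes its tagger as a REST endpoint rather than an SDK; a minimal sketch with the requests library, where the key, secret, and image URL are all placeholders:

```python
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder

# Imagga's v2 tagging endpoint authenticates with HTTP basic auth.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder
    auth=(API_KEY, API_SECRET),
)

# Each tag carries an English name and a confidence score,
# as in the list above.
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))
```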

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.8
posing 95
clothing 91.9
outdoor 87
standing 81.2
person 77.3
old 77
man 74.6
footwear 71.5
black and white 68.7
player 67.3
group 64.7
ship 54.7
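
Microsoft's tags come from the Azure Computer Vision service, which scores each tag between 0 and 1 (shown above as percentages). A minimal sketch with the azure-cognitiveservices-vision-computervision SDK, where the endpoint and key are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_key"),        # placeholder key
)

with open("photo.jpg", "rb") as f:  # placeholder filename
    result = client.tag_image_in_stream(f)

# Confidences are 0-1 floats; scale to percentages as displayed above.
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))
```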

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 93.4%
Calm 99.8%
Sad 0.1%
Confused 0%
Surprised 0%
Happy 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 19-27
Gender Male, 95.7%
Calm 72.3%
Fear 13.5%
Sad 11.4%
Surprised 1%
Happy 0.5%
Angry 0.5%
Disgusted 0.4%
Confused 0.3%

AWS Rekognition

Age 25-35
Gender Male, 61.3%
Happy 82.4%
Calm 11.3%
Angry 1.4%
Sad 1.3%
Confused 1.3%
Surprised 1.1%
Disgusted 0.7%
Fear 0.4%

AWS Rekognition

Age 23-33
Gender Female, 67.7%
Calm 99.8%
Happy 0.1%
Sad 0.1%
Fear 0%
Surprised 0%
Confused 0%
Angry 0%
Disgusted 0%
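
Each block above is one face from Rekognition's DetectFaces operation with full attributes requested: an estimated age range, a gender guess with its confidence, and a score for each of eight emotions. A minimal sketch, again with a placeholder filename and configured credentials:

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder filename
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Sorted by confidence, the emotions read like the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```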

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
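
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why every row above reads "Very unlikely" or "Unlikely". A minimal sketch with the google-cloud-vision client, assuming application credentials are configured:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is an enum bucket, not a numeric score.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```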

Feature analysis

Amazon

Person 99.8%
Shoe 90.8%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 95.9%
a vintage photo of a group of people posing for a picture 95.8%
a group of people posing for a photo 95.7%
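
The captions come from the same Azure Computer Vision service as the Microsoft tags, via its describe operation, which can return several ranked candidate sentences; that matches the three near-duplicate captions above. A minimal sketch (endpoint, key, and filename are placeholders as before):

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_key"),        # placeholder key
)

with open("photo.jpg", "rb") as f:  # placeholder filename
    # Ask for up to three candidate captions, as listed above.
    result = client.describe_image_in_stream(f, max_candidates=3)

for caption in result.captions:
    print(caption.text, round(caption.confidence * 100, 1))
```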

Text analysis

Amazon

38762
58
YT37A°-C
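
Fragments like "38762" and "YT37A°-C" are raw OCR hits from Rekognition's DetectText operation, which returns whatever character sequences it finds, whether or not they form meaningful words. A minimal sketch:

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder filename
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Type is LINE or WORD; each line is also broken into word hits.
    print(detection["DetectedText"], detection["Type"],
          f'{detection["Confidence"]:.1f}')
```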

Google

38762
38762
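
Google's OCR comes from the Vision API's text detection, which returns the full detected text as the first annotation and then each element individually; that likely explains why "38762" appears twice above. A minimal sketch:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full text; the rest are individual
# elements, so a lone number can appear twice.
for annotation in response.text_annotations:
    print(annotation.description)
```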