Human Generated Data

Title

Untitled (man playing accordion for a group of people sitting on the grass)

Date

c. 1965

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11465

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Human 99.8
Person 99.8
Person 99.4
Person 98.9
Person 98.8
Person 97.6
Person 97.2
Person 95.6
Person 92.4
Animal 90.3
Dog 90.3
Mammal 90.3
Canine 90.3
Pet 90.3
Musical Instrument 85.3
Indoors 84.3
Interior Design 84.3
Accordion 80
Transportation 79.5
Automobile 79.5
Vehicle 79.5
Car 79.5
Face 75.9
Machine 71.8
Wheel 71.8
Person 71.8
Musician 71.8
People 71.5
Chair 68.7
Furniture 68.7
Leisure Activities 65.2
Portrait 62.1
Photo 62.1
Photography 62.1
Clothing 59
Apparel 59
Overcoat 59
Suit 59
Coat 59
Shorts 55.2
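Each machine-generated tag above carries a confidence score, and downstream use typically filters on a cutoff. A minimal sketch of that filtering step, using a handful of the (label, confidence) pairs from the Amazon Rekognition output above; the 80.0 threshold is an assumption, not part of the record:

```python
# Sample (label, confidence) pairs copied from the Rekognition tags above.
tags = [
    ("Person", 99.8),
    ("Dog", 90.3),
    ("Musical Instrument", 85.3),
    ("Accordion", 80.0),
    ("Car", 79.5),
    ("Wheel", 71.8),
    ("Shorts", 55.2),
]

def confident_tags(pairs, threshold=80.0):
    """Keep only labels whose confidence meets the threshold.

    The threshold default is a hypothetical choice for illustration.
    """
    return [label for label, score in pairs if score >= threshold]

print(confident_tags(tags))
```

At a cutoff of 80.0 this keeps Person, Dog, Musical Instrument, and Accordion while dropping the lower-confidence vehicle and clothing tags.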

Imagga
created on 2022-01-14

wind instrument 100
accordion 100
musical instrument 100
keyboard instrument 100
people 26.8
man 25.5
adult 25.3
male 22
person 18.6
outdoor 16.8
lifestyle 16.6
silhouette 15.7
outdoors 15.7
sport 15.7
portrait 15.5
sunset 15.3
black 13.8
play 13.8
summer 13.5
sky 13.4
grass 12.7
happy 12.5
joy 12.5
field 11.7
couple 11.3
attractive 11.2
love 11
relax 11
leisure 10.8
active 10.8
run 10.6
lady 10.5
outside 10.3
day 10.2
body 9.6
boy 9.6
men 9.4
freedom 9.1
pretty 9.1
fashion 9
sexy 8.8
together 8.8
women 8.7
happiness 8.6
sitting 8.6
model 8.6
youth 8.5
art 8.5
free 8.4
beach 8.4
teen 8.3
holding 8.3
fun 8.2
teenager 8.2
playing 8.2
girls 8.2
exercise 8.2
suit 8.1
player 7.9
hand 7.6
dark 7.5
park 7.4
sun 7.2
smiling 7.2
fitness 7.2
music 7.2
meadow 7.2
romance 7.1
smile 7.1
romantic 7.1
face 7.1

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

outdoor 88.2
text 71.1
person 69
vehicle 67.7
car 62.2
black and white 61.2
land vehicle 50

Face analysis

Amazon

Google

AWS Rekognition

Age 24-34
Gender Male, 96.4%
Calm 98.5%
Sad 0.5%
Confused 0.5%
Happy 0.1%
Fear 0.1%
Disgusted 0.1%
Surprised 0.1%
Angry 0.1%

AWS Rekognition

Age 49-57
Gender Male, 99.8%
Calm 76.2%
Sad 9.2%
Confused 6.6%
Surprised 2.5%
Angry 2.4%
Happy 1.8%
Disgusted 0.9%
Fear 0.4%

AWS Rekognition

Age 31-41
Gender Female, 82.6%
Calm 59.2%
Sad 21%
Confused 16.4%
Disgusted 1%
Angry 1%
Happy 0.6%
Surprised 0.5%
Fear 0.2%

AWS Rekognition

Age 42-50
Gender Female, 64.2%
Calm 99.6%
Sad 0.2%
Confused 0.1%
Disgusted 0%
Surprised 0%
Happy 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 36-44
Gender Male, 100%
Disgusted 55.5%
Angry 17.8%
Happy 7.1%
Calm 5.9%
Sad 4.5%
Surprised 3.8%
Confused 2.7%
Fear 2.7%

AWS Rekognition

Age 54-62
Gender Male, 96.8%
Calm 80.9%
Happy 10.8%
Sad 6%
Disgusted 0.7%
Angry 0.5%
Confused 0.5%
Surprised 0.4%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Dog 90.3%
Car 79.5%
Wheel 71.8%

Captions

Microsoft

a group of people in an old photo of a person 87.8%
a group of people around each other 87.6%
a group of people posing for a photo 87%

Text analysis

Amazon

45652
KODVK--S.VEEIA--1

Google

4 5652 MJI7--YT37A°2 -- AQox
4
5652
AQox
MJI7--YT37A°2
--