Human Generated Data

Title

Untitled (parents walking with three children on sidewalk)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8784

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Shorts 100
Clothing 100
Apparel 100
Person 99.7
Human 99.7
Person 99.6
Person 99.4
Shoe 96.9
Footwear 96.9
Person 92.2
Shoe 88.6
Person 86.7
Plant 80.9
Tree 79.1
Shoe 74.1
Female 69.6
Arecaceae 69.1
Palm Tree 69.1
Face 66.1
Portrait 64.6
Photo 64.6
Photography 64.6
Path 62.5
People 61.2
Shoe 59.6
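The Amazon tag list above is the shape of output produced by a label-detection service such as AWS Rekognition, which returns each label with a confidence score. As a minimal illustrative sketch (not the museum's actual pipeline), the hand-made sample response below shows how such a JSON payload maps to the "name confidence" pairs listed above; the values are copied from a few of the tags shown.

```python
# Illustrative sample in the shape of a Rekognition DetectLabels response.
# The data is hand-made from a few tags above, not a real API call.
sample_response = {
    "Labels": [
        {"Name": "Shorts", "Confidence": 99.96},
        {"Name": "Person", "Confidence": 99.7},
        {"Name": "Shoe", "Confidence": 96.9},
        {"Name": "Palm Tree", "Confidence": 69.1},
    ]
}

def format_labels(response, min_confidence=50.0):
    """Return (name, rounded score) pairs, highest confidence first."""
    labels = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]
    return sorted(labels, key=lambda pair: pair[1], reverse=True)

for name, score in format_labels(sample_response):
    print(f"{name} {score}")
```

Note that scores round to one decimal place here, which is why a 99.96 confidence appears as "100" in lists like the one above.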

Imagga
created on 2022-01-09

musical instrument 24.9
man 23.5
person 22
people 19
adult 15.7
male 12.8
stage 12.1
world 11.4
fashion 11.3
wind instrument 11.2
percussion instrument 11.1
business 10.9
happy 10.6
couple 10.4
portrait 10.3
hair 10.3
performer 10.2
lifestyle 9.4
architecture 9.4
old 9
sport 8.7
face 8.5
accordion 8.5
black 8.5
clothing 8.5
hat 8.4
work 8.2
human 8.2
building 8.2
art 7.9
happiness 7.8
color 7.8
expression 7.7
power 7.6
city 7.5
musician 7.5
industrial 7.3
sexy 7.2
banjo 7.1
keyboard instrument 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.1
footwear 88.2
clothing 86
person 84.3
posing 71.6
black and white 53.1

Face analysis

Amazon

Google

AWS Rekognition

Age 49-57
Gender Male, 91.9%
Calm 78.4%
Happy 12.1%
Surprised 5.8%
Confused 1.7%
Sad 0.6%
Disgusted 0.5%
Angry 0.5%
Fear 0.4%

AWS Rekognition

Age 53-61
Gender Male, 91.2%
Calm 87.3%
Sad 7.9%
Confused 2.9%
Surprised 1%
Happy 0.3%
Fear 0.2%
Disgusted 0.2%
Angry 0.1%
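Each AWS Rekognition face record above lists every candidate emotion with a confidence, and the percentages for one face sum to roughly 100. A short sketch of how a dominant emotion can be read off such a record (the values are copied from the second face above; this is an illustration, not the service's own code):

```python
# Emotion confidences copied from the second AWS Rekognition face above.
emotions = {
    "Calm": 87.3, "Sad": 7.9, "Confused": 2.9, "Surprised": 1.0,
    "Happy": 0.3, "Fear": 0.2, "Disgusted": 0.2, "Angry": 0.1,
}

def dominant_emotion(scores):
    """Return the highest-confidence emotion and its score."""
    name = max(scores, key=scores.get)
    return name, scores[name]

print(dominant_emotion(emotions))  # ("Calm", 87.3)
```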

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
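Unlike the numeric scores above, Google Cloud Vision reports each face attribute (joy, sorrow, anger, surprise, headwear, blur) as a discrete likelihood bucket rather than a percentage. A hedged sketch of how those enum values map to the display strings used in the records above; the sample annotation is hand-made to match the second face record, not an actual API response:

```python
# Google Cloud Vision's Likelihood buckets, mapped to display strings.
LIKELIHOOD_DISPLAY = {
    "UNKNOWN": "Unknown",
    "VERY_UNLIKELY": "Very unlikely",
    "UNLIKELY": "Unlikely",
    "POSSIBLE": "Possible",
    "LIKELY": "Likely",
    "VERY_LIKELY": "Very likely",
}

# Illustrative annotation matching the second Google Vision face above.
sample_face = {
    "surpriseLikelihood": "VERY_UNLIKELY",
    "angerLikelihood": "VERY_UNLIKELY",
    "sorrowLikelihood": "VERY_UNLIKELY",
    "joyLikelihood": "VERY_UNLIKELY",
    "headwearLikelihood": "VERY_UNLIKELY",
    "blurredLikelihood": "POSSIBLE",
}

def describe_face(face):
    """Strip the 'Likelihood' suffix and render each bucket as display text."""
    return {
        key.replace("Likelihood", ""): LIKELIHOOD_DISPLAY[value]
        for key, value in face.items()
    }

print(describe_face(sample_face))
```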

Feature analysis

Amazon

Person 99.7%
Shoe 96.9%

Captions

Microsoft

a group of people posing for a photo 89.3%
a group of people posing for a picture 89.2%
a group of people posing for the camera 89.1%

Text analysis

Amazon

8SD

Google

8SA
8SA