Human Generated Data

Title

Untitled (couple shopping at hat stand in Montigo Bay, Jamaica)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8959

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99.8
Person 99.6
Clothing 99.2
Apparel 99.2
Shoe 98.8
Footwear 98.8
Shorts 95.7
Shoe 88
Shoe 84.7
Female 74.3
People 70.8
Dress 67.6
Meal 66.9
Food 66.9
Face 64.6
Plant 63.4
Shoe 63.3
Person 61.5
Leisure Activities 61.3
Girl 59.8
Costume 59.1
Suit 57.5
Coat 57.5
Overcoat 57.5
Accessories 55
Accessory 55
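
The label/score pairs above are confidence percentages returned by an image-labeling service. As a minimal sketch (not necessarily the pipeline used to build this record), tags in this form can be reproduced with AWS Rekognition's DetectLabels API via boto3; the file name and MinConfidence threshold below are assumptions.

import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the photograph; AWS credentials and region are
# assumed to be configured in the environment.
with open("steinmetz_8959.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumption: matches the lowest score listed above
)

# Each result pairs a label name with a confidence percentage, e.g. "Person 99.8".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')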

Clarifai
created on 2023-10-25

people 99.9
music 97.7
group 97.5
wear 97.2
group together 97.1
woman 96.4
man 96.1
adult 95.8
many 95.7
musician 93
child 92.1
monochrome 91.7
drum 89
outfit 88.8
recreation 88.7
dancing 88.1
street 87.8
instrument 87.5
several 84.5
percussion instrument 84

Imagga
created on 2022-01-09

musical instrument 44.3
wind instrument 27.7
accordion 22.9
people 22.3
person 21.6
man 21.5
adult 20.3
brass 18.7
keyboard instrument 18.4
percussion instrument 14.3
male 13.5
human 13.5
newspaper 13.3
portrait 12.3
men 12
sport 11.5
symbol 11.4
protection 10.9
product 10.5
car 10.3
silhouette 9.9
business 9.7
mask 9.6
city 9.1
outdoors 9
urban 8.7
steel drum 8.5
outdoor 8.4
fun 8.2
creation 8.2
danger 8.2
industrial 8.2
work 8.1
dirty 8.1
transportation 8.1
activity 8.1
success 8
day 7.8
black 7.8
military 7.7
clothing 7.7
hand 7.6
life 7.6
street 7.4
office 7.2
sexy 7.2
equipment 7.2
game 7.1
face 7.1
job 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.4
clothing 97.6
person 97.3
man 82.1
footwear 79.3
woman 67.4
posing 41.8

Color Analysis

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 95.4%
Calm 96.8%
Sad 2.4%
Fear 0.3%
Confused 0.2%
Happy 0.1%
Disgusted 0.1%
Angry 0.1%
Surprised 0%

AWS Rekognition

Age 16-22
Gender Female, 96.9%
Calm 44.8%
Happy 21.2%
Confused 9.7%
Disgusted 8.9%
Sad 7.4%
Angry 3.3%
Fear 2.8%
Surprised 1.9%
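
Per-face age ranges, gender estimates, and emotion scores like the two blocks above correspond to AWS Rekognition's DetectFaces output. A minimal sketch, assuming a hypothetical local image file and requesting the full attribute set:

import boto3

client = boto3.client("rekognition")

with open("steinmetz_8959.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; print highest-confidence first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')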

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
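
The three blocks above report Google Cloud Vision face-detection likelihoods (one block per detected face), which are categorical values such as Very unlikely, Unlikely, and Possible rather than percentages. A minimal sketch using the google-cloud-vision client library, again with a hypothetical file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_8959.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood fields are enums: VERY_UNLIKELY, UNLIKELY, POSSIBLE, ...
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)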

Feature analysis

Amazon

Person 99.8%
Shoe 98.8%

Categories