Human Generated Data

Title

Untitled (woman holding up tea kettle surrounded by other women and gifts)

Date

1958

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4652

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Chair 99.6
Furniture 99.6
Person 99.4
Human 99.4
Person 98.6
Person 97.8
Person 95.8
Person 94.7
Meal 93.9
Food 93.9
Face 92.9
Clothing 92.5
Apparel 92.5
Sunglasses 88.5
Accessories 88.5
Accessory 88.5
Person 87.6
Tablecloth 87.5
Crowd 83.5
Restaurant 82.8
Person 82.6
Person 81.7
Person 81.6
People 81.2
Dish 79.9
Table 78.3
Indoors 77
Female 73.5
Portrait 73.5
Photography 73.5
Photo 73.5
Dining Table 72.3
Home Decor 71.4
Person 70.1
Dress 68.8
Room 67.8
Shorts 67.5
Child 65.6
Kid 65.6
Man 64.2
Girl 63.2
Outdoors 62.3
Audience 62.3
Leisure Activities 60.3
Smile 59.5
Woman 58.5
Overcoat 57.6
Suit 57.6
Coat 57.6
Sitting 56.8
Linen 56.5
Party 55.9

Clarifai
created on 2023-10-15

people 99.8
group 99.1
child 98.3
man 96.8
many 96.2
group together 95.3
crowd 94.4
woman 92.8
boy 92.2
adult 91.1
education 91
war 90.1
school 89.7
music 88.5
administration 87.2
dancing 84.7
adolescent 84
sit 82.4
recreation 81.6
audience 77.6

Imagga
created on 2021-12-14

business 24.3
male 22.7
man 19
newspaper 17.7
drawing 17.6
people 17.3
person 17
grunge 17
silhouette 16.5
chart 16.2
businessman 15.9
negative 15.8
product 14.6
office 14.5
plan 14.2
creation 13.9
classroom 13.8
design 13.5
team 13.4
work 13.4
diagram 13.4
old 13.2
room 12.9
sign 12.8
technology 12.6
paper 12.6
film 12.3
education 12.1
construction 12
finance 11.8
vintage 11.6
retro 11.5
sketch 11.3
success 11.3
human 11.2
manager 11.2
art 11
symbol 10.8
building 10.6
gymnasium 10.5
group 10.5
men 10.3
architecture 10.2
financial 9.8
idea 9.8
black 9.6
blackboard 9.4
adult 9.4
youth 9.4
student 9.3
professional 9.1
hand 9.1
modern 9.1
aged 9
sky 8.9
computer 8.9
growth 8.8
teaching 8.8
boy 8.7
play 8.6
writing 8.5
athletic facility 8.4
portrait 8.4
power 8.4
graphic 8
looking 8
photographic paper 8
women 7.9
urban 7.9
child 7.8
graph 7.7
money 7.7
city 7.5
dollar 7.4
sport 7.4
letter 7.3
facility 7.3
currency 7.2
teacher 7.1
job 7.1

Google
created on 2021-12-14

Black-and-white 84.2
Style 83.9
Font 82.5
Shorts 80.3
Adaptation 79.4
Hat 77.2
Monochrome photography 74.2
Monochrome 72.9
Chair 72.8
Event 72.2
Art 72.1
Crew 67.8
Visual arts 64.5
Room 64.5
Stock photography 63.3
T-shirt 62.6
Team 60.4
Sitting 60.2
Photo caption 58.6
History 57.8

Microsoft
created on 2021-12-14

text 98.9
drawing 92.1
person 90.8
clothing 86.6
man 80.2
window 80.2
black and white 79.8
sketch 79.2
cartoon 78.9

Face analysis

AWS Rekognition

Age 20-32
Gender Male, 67.6%
Calm 82.1%
Surprised 14.4%
Confused 1.4%
Happy 1.2%
Sad 0.5%
Angry 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 26-42
Gender Female, 73.4%
Happy 56.4%
Calm 25.6%
Sad 10.9%
Surprised 2.8%
Fear 2.5%
Confused 1%
Angry 0.7%
Disgusted 0.2%

AWS Rekognition

Age 22-34
Gender Male, 53.5%
Calm 95.9%
Sad 2.7%
Happy 1.3%
Angry 0.1%
Surprised 0%
Confused 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 22-34
Gender Female, 96.3%
Happy 50.2%
Calm 21.7%
Sad 13.6%
Confused 6.4%
Angry 4.5%
Surprised 2.6%
Fear 0.6%
Disgusted 0.5%

AWS Rekognition

Age 22-34
Gender Male, 65.3%
Calm 61.8%
Angry 13.1%
Sad 10.5%
Surprised 8.6%
Happy 3.4%
Confused 1.9%
Disgusted 0.3%
Fear 0.3%

AWS Rekognition

Age 21-33
Gender Male, 76.3%
Calm 44.3%
Sad 38.6%
Happy 9.8%
Angry 4.4%
Confused 1.8%
Fear 0.4%
Surprised 0.4%
Disgusted 0.2%

AWS Rekognition

Age 45-63
Gender Female, 78%
Calm 88.7%
Sad 9.9%
Surprised 0.5%
Confused 0.2%
Happy 0.2%
Fear 0.2%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-37
Gender Female, 59.6%
Sad 46.7%
Calm 39.4%
Happy 6.7%
Angry 2.8%
Fear 2.5%
Confused 0.8%
Surprised 0.7%
Disgusted 0.3%

AWS Rekognition

Age 30-46
Gender Female, 71.4%
Sad 66.2%
Calm 14.3%
Happy 8.4%
Angry 6.4%
Confused 2.5%
Fear 1%
Disgusted 0.8%
Surprised 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Sunglasses 88.5%

Categories

Imagga

paintings art 80.1%
people portraits 19%

Text analysis

Amazon

25200
KODVK--80