Human Generated Data

Title

Untitled (men in suits and hats gathered around table)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4931

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Person 99.4
Person 99.2
Person 98.6
Person 97.6
Person 95.7
Person 90.2
Clinic 82.8
Lab 78.9
Building 74.1
People 73.9
Accessories 61.7
Sunglasses 61.7
Accessory 61.7
Scientist 57.6
Chef 55.6

Imagga
created on 2022-01-23

drawing 38.5
sketch 26.6
design 23.1
silhouette 21.5
art 21.4
grunge 19.6
facility 18.7
negative 17.4
retro 16.4
pattern 15.7
business 14.6
archipelago 14.4
web site 14
graphic 13.9
treasury 13.4
vintage 13.2
people 12.8
depository 12.6
land 12.6
old 12.5
element 12.4
gymnasium 12.3
film 12.2
frame 12
black 12
plan 11.3
paper 11.3
product 11.2
representation 11
architecture 10.9
man 10.8
creation 10.6
construction 10.3
decoration 10.1
symbol 10.1
paint 10
photographic paper 9.4
set 9.3
clip art 9.3
male 9.2
decorative 9.2
athletic facility 9.1
texture 9
equipment 9
team 9
backgrounds 8.9
cartoon 8.9
bank 8.3
pencil 8.3
style 8.2
person 7.7
line 7.7
floral 7.7
house 7.5
antique 7.3
dirty 7.2
painting 7.2
creative 7.1
modern 7

Google
created on 2022-01-23

Coat 91.3
Hat 88.8
Chef 86.3
Food 86.3
Black-and-white 82.4
Cooking 80.1
Apron 75.7
Monochrome photography 74.7
Kitchen 74.2
Monochrome 73.1
Window 71.7
Cook 71.6
Service 69.3
T-shirt 67.2
Vintage clothing 66.1
Event 65.1
Fast food 64.7
Job 63.7
Science 62.4
Kitchen appliance 61.6

Microsoft
created on 2022-01-23

text 99
clothing 93
person 91.4
window 86
drawing 80.5
man 73
sketch 54.3
posing 40.8

Face analysis

Amazon

Google

AWS Rekognition

Age 33-41
Gender Male, 96.7%
Calm 48%
Confused 30.6%
Happy 11.1%
Sad 2.6%
Disgusted 2.5%
Angry 2.3%
Surprised 2.1%
Fear 0.9%

AWS Rekognition

Age 50-58
Gender Male, 91.8%
Calm 37.1%
Disgusted 33.3%
Fear 9.1%
Sad 6.7%
Angry 6.6%
Happy 4.6%
Surprised 1.5%
Confused 1%

AWS Rekognition

Age 35-43
Gender Male, 99.8%
Sad 34.2%
Calm 33.7%
Confused 14.3%
Surprised 4.8%
Disgusted 4.2%
Happy 3.8%
Angry 3.6%
Fear 1.5%

AWS Rekognition

Age 42-50
Gender Male, 97.6%
Calm 46.2%
Happy 31.6%
Surprised 11.3%
Angry 3.6%
Confused 2.4%
Sad 2.1%
Disgusted 1.6%
Fear 1.2%

AWS Rekognition

Age 43-51
Gender Male, 97%
Sad 91.8%
Happy 3.4%
Confused 1.7%
Fear 1%
Calm 0.8%
Angry 0.5%
Disgusted 0.4%
Surprised 0.4%

AWS Rekognition

Age 21-29
Gender Female, 80.7%
Calm 75.2%
Happy 12.9%
Sad 4.2%
Fear 3.3%
Angry 2.6%
Disgusted 0.8%
Surprised 0.6%
Confused 0.3%

AWS Rekognition

Age 23-31
Gender Male, 74.5%
Calm 97.6%
Sad 1%
Confused 0.5%
Angry 0.3%
Surprised 0.2%
Disgusted 0.1%
Happy 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Sunglasses 61.7%

Captions

Microsoft

a group of people posing for a photo 68.2%
a group of people posing for the camera 68.1%
a group of people posing for a picture 68%

Text analysis

Amazon

10731.

Google

.....
10731.
10731. 10731. .....