Human Generated Data

Title

Untitled (two sailors and two women sitting at restaurant table)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4999

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 99.7
Person 99.7
Person 99.3
Person 99.1
Person 98.9
Person 97.9
Person 97.6
Person 94.7
Restaurant 88.5
Drink 72.7
Beverage 72.7
Text 71.5
Cafeteria 70.9
Shop 60.4
Meal 57.7
Food 57.7
Cafe 57.7
Bar Counter 56.2
Pub 56.2
Person 45.5

Imagga
created on 2022-01-22

shop 42
newspaper 41.9
product 33.1
mercantile establishment 28
stall 25.6
creation 24.9
people 21.7
man 20.8
business 18.8
counter 18.8
place of business 18.4
restaurant 18.3
bakery 17.6
person 17.3
building 14.6
city 14.1
smiling 13.7
transportation 13.4
cafeteria 13.2
adult 12.4
work 11.8
lifestyle 11.6
interior 11.5
male 11.3
barbershop 10.9
house 10.9
holding 10.7
travel 10.6
indoors 10.5
customer 10.5
store 10.4
men 10.3
finance 10.1
happy 10
color 10
room 10
daily 9.6
urban 9.6
architecture 9.4
smile 9.3
transport 9.1
old 9.1
design 9
one 9
establishment 8.8
working 8.8
home 8.8
women 8.7
30s 8.7
supermarket 8.4
shoe shop 8.4
shopping 8.3
worker 8
office 8
businessman 7.9
mall 7.8
table 7.8
chair 7.8
glass 7.8
buying 7.7
modern 7.7
money 7.6
casual 7.6
retail 7.6
horizontal 7.5
service 7.4
20s 7.3
cheerful 7.3
decoration 7.2
looking 7.2
portrait 7.1

Google
created on 2022-01-22

Photograph 94.2
Black 89.8
Coat 86.5
Black-and-white 86.2
Style 83.9
Eyewear 80
Monochrome 78.8
Monochrome photography 78.2
Suit 77.2
T-shirt 75.1
Snapshot 74.3
Hat 74.1
Shelf 69.9
Font 69.8
Customer 66
Shelving 64
Stock photography 61.9
Room 60.3
Table 59.5
Retail 57.6

Microsoft
created on 2022-01-22

text 100
person 95.4
clothing 95.3
man 91.8
table 85.7
human face 73.1
bottle 56.2
posing 40.7

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 96.4%
Calm 45.5%
Surprised 20.4%
Sad 13.9%
Angry 6.7%
Confused 5.1%
Happy 4.2%
Disgusted 2.3%
Fear 2%

AWS Rekognition

Age 26-36
Gender Female, 89%
Calm 97%
Sad 1.7%
Angry 0.9%
Disgusted 0.1%
Fear 0.1%
Confused 0.1%
Happy 0.1%
Surprised 0.1%

AWS Rekognition

Age 49-57
Gender Male, 99.4%
Calm 92.4%
Sad 5%
Confused 0.9%
Disgusted 0.5%
Surprised 0.4%
Happy 0.4%
Angry 0.3%
Fear 0.1%

AWS Rekognition

Age 31-41
Gender Male, 99.4%
Calm 70.9%
Sad 19.7%
Surprised 3.1%
Disgusted 2.4%
Happy 1.3%
Confused 1.1%
Angry 1.1%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people posing for a photo 79.7%
a group of people posing for the camera 79.6%
a group of people posing for a picture 79.5%

Text analysis

Amazon

16438.
REAL
15438
REAL PHOTOGRAPHS
PHOTOGRAPHS
plus
SIDEL

Google

15438. REAL PROTOCAN 17reet- 16438.
PROTOCAN
REAL
15438.
17reet-
16438.