Human Generated Data

Title

Untitled (people seated in a room with a large fireplace, Fountain House Inn, D

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11929

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.6
Human 99.6
Person 99.1
Person 98.8
Person 98
Person 97.2
Person 93
Person 83.2
Person 83
Restaurant 82.3
Person 81.8
Person 80.1
Meal 79.2
Food 79.2
Cafeteria 69.5
Clinic 67.7
Airplane 65.5
Transportation 65.5
Vehicle 65.5
Aircraft 65.5
Person 65.1
Person 64.3
Indoors 63.6
Room 63.2
Cafe 58.9
Leisure Activities 57.2
Person 51.3

Clarifai
created on 2023-10-25

people 99.9
group 98.8
many 98.6
group together 98.5
vehicle 97.4
adult 97.4
man 94.5
transportation system 94.4
music 91
furniture 90.9
street 90
several 89.9
monochrome 89.4
woman 88.3
aircraft 86.8
room 86.4
military 86
indoors 85.6
crowd 85
recreation 83.5

Imagga
created on 2022-01-15

passenger 23.2
industry 23
industrial 22.7
factory 22.2
building 18.2
transportation 17
power 16.8
business 16.4
urban 15.7
travel 15.5
people 15
station 14.8
steel 14.1
city 14.1
structure 14.1
metal 13.7
work 13.3
pipe 12.6
engineering 12.4
inside 12
architecture 11.8
manufacturing 11.7
crowd 10.6
modern 10.5
technology 10.4
life 10.4
ship 10.3
equipment 10
interior 9.7
sky 9.6
machine 9.5
journey 9.4
water 9.3
energy 9.2
plaza 9.2
plant 9
pipes 8.9
mechanical 8.7
gas 8.7
concrete 8.6
shop 8.5
device 8.2
world 8
stage 7.9
machinery 7.8
waste 7.8
technical 7.7
military 7.7
tube 7.7
case 7.7
pollution 7.7
oil 7.4
man 7.4
vacation 7.4
counter 7.3
tank 7.3
time 7.3
transport 7.3
protection 7.3
train 7.1
war 7.1
hall 7.1
carousel 7
valve 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

black and white 93.1
text 92.7
person 92.5
indoor 92.3
clothing 68.3
man 57.3

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 99.5%
Disgusted 92.4%
Calm 3.7%
Sad 1%
Fear 0.9%
Happy 0.9%
Confused 0.6%
Surprised 0.3%
Angry 0.2%

AWS Rekognition

Age 30-40
Gender Male, 66.3%
Calm 57%
Happy 36.6%
Sad 4%
Fear 0.9%
Angry 0.6%
Confused 0.4%
Disgusted 0.3%
Surprised 0.3%

AWS Rekognition

Age 20-28
Gender Female, 52.1%
Happy 60.3%
Calm 16.8%
Disgusted 8.5%
Sad 5%
Fear 4.7%
Confused 2.1%
Angry 1.7%
Surprised 0.8%

AWS Rekognition

Age 23-31
Gender Male, 78%
Calm 89.4%
Sad 10%
Confused 0.2%
Fear 0.1%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Happy 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Airplane 65.5%

Text analysis

Google

MJIHY TAT A TOA
MJIHY
TAT
A
TOA