Human Generated Data

Title

Untitled (people seated in a room with a large fireplace, Fountain House Inn, D

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11930

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.5
Human 99.5
Person 99.3
Person 99.1
Person 97.8
Person 97.1
Person 96.9
Person 96.8
Person 93.2
Person 92
Person 91.1
Person 89.3
Person 80.8
Meal 78.6
Food 78.6
Restaurant 78.5
Cafeteria 74.1
Workshop 70.5
Person 65.7
Person 65.5
Art 63.9
Chair 62.6
Furniture 62.6
Cafe 58.2
Airplane 57.1
Transportation 57.1
Vehicle 57.1
Aircraft 57.1
Indoors 55.9
Room 55.5
Airplane 52.5

Clarifai
created on 2023-10-25

people 99.9
adult 98.6
group 98.5
monochrome 98.4
vehicle 98.2
group together 98.1
many 97.4
man 95.8
transportation system 95.6
several 95.4
woman 92.3
war 91.4
military 90.5
aircraft 89.6
watercraft 88.5
furniture 84.3
room 83.2
child 81
airplane 78
street 78

Imagga
created on 2022-01-15

industry 26.4
surgeon 23.7
industrial 22.7
shop 21.1
passenger 20.8
business 18.2
power 17.6
city 17.4
transportation 16.1
equipment 15.6
factory 15.4
metal 15.3
mercantile establishment 15.1
urban 14.8
technology 14.8
ship 14.5
station 13.7
steel 13.2
people 12.8
barbershop 12.7
interior 12.4
work 11.8
building 11.7
inside 11
architecture 10.9
energy 10.9
man 10.7
manufacturing 10.7
salon 10.7
room 10.7
military 10.6
travel 10.5
modern 10.5
tank 10.2
place of business 9.9
old 9.7
pipe 9.7
engineering 9.5
vessel 9.2
transport 9.1
war 9
battleship 8.9
center 8.7
water 8.7
adult 8.5
world 8.3
occupation 8.2
plant 8.2
vehicle 8.1
hospital 8.1
life 8.1
train 7.9
pressure 7.8
men 7.7
tube 7.7
uniform 7.7
machine 7.6
danger 7.3
activity 7.2
military vehicle 7.1
male 7.1
sky 7

Google
created on 2022-01-15

Photograph 94.3
White 92.2
Black 89.6
Black-and-white 86.7
Style 84
Monochrome photography 76.8
Chair 75.6
Monochrome 75.3
Snapshot 74.3
Event 70
Stock photography 68.4
Room 66
Art 66
Machine 63.4
History 59.4
T-shirt 54.3
Photography 51.6
Suit 51.2

Microsoft
created on 2022-01-15

person 94.4
text 91.1
black and white 85.3
clothing 80.5
man 79.1
ship 67.9

Face analysis

AWS Rekognition

Age 52-60
Gender Male, 100%
Calm 95.2%
Sad 3.6%
Angry 0.4%
Confused 0.4%
Surprised 0.1%
Happy 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 23-33
Gender Female, 64.7%
Calm 82.2%
Happy 12.7%
Angry 1.3%
Surprised 1.2%
Sad 0.9%
Disgusted 0.7%
Confused 0.7%
Fear 0.3%

AWS Rekognition

Age 33-41
Gender Female, 68.6%
Calm 71.1%
Sad 11.2%
Surprised 9.1%
Confused 3.2%
Happy 3.1%
Disgusted 0.9%
Fear 0.8%
Angry 0.6%

AWS Rekognition

Age 25-35
Gender Male, 87%
Calm 99.4%
Fear 0.3%
Angry 0.1%
Sad 0.1%
Happy 0%
Surprised 0%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 39-47
Gender Male, 99.3%
Calm 97.8%
Happy 1.7%
Confused 0.2%
Sad 0.1%
Surprised 0.1%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 26-36
Gender Male, 97.5%
Sad 95.4%
Calm 2.4%
Confused 1.6%
Disgusted 0.2%
Happy 0.1%
Fear 0.1%
Angry 0.1%
Surprised 0%

AWS Rekognition

Age 14-22
Gender Female, 97%
Calm 44.8%
Sad 29.6%
Confused 10.6%
Happy 7.8%
Angry 3.1%
Fear 2.1%
Disgusted 1.1%
Surprised 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Chair 62.6%
Airplane 57.1%

Text analysis

Amazon

MJI3
MJI3 ARDA
ARDA

Google

MJI7 YT31A2 A 13A
MJI7
YT31A2
A
13A