Human Generated Data

Title

Untitled (man and women eating on the floor)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8270

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-08

Person 99
Human 99
Person 95.9
Person 95.9
Clinic 94.2
Furniture 90.4
Person 83.5
Hospital 81.6
Operating Theatre 76.9
Table 75.5
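
The label/confidence pairs above are the kind of output Amazon Rekognition's DetectLabels operation returns. A minimal sketch of how such tags can be generated with boto3; the file name and thresholds are illustrative, not part of this record:

    import boto3

    # Rekognition client; assumes AWS credentials are configured in the environment.
    client = boto3.client("rekognition")

    with open("steinmetz_8270.jpg", "rb") as f:  # illustrative file name
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=10,      # the list above shows ten labels
        MinConfidence=75,  # illustrative threshold
    )
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))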

Clarifai
created on 2023-10-25

people 99.8
group together 98.9
group 98.2
adult 96.8
wear 95.7
woman 95.1
vehicle 92.7
man 92.2
monochrome 91.7
many 90.7
administration 90.5
several 90.4
street 89.3
furniture 88.8
child 87.9
transportation system 87.6
medical practitioner 87.1
hospital 86.8
war 85.8
three 85.5
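
Clarifai concepts like those above can be fetched through its gRPC client. A sketch following Clarifai's documented quickstart against the public general model; the model ID, user/app IDs, file name, and environment variable are assumptions, not taken from this record:

    import os
    from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
    from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

    stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
    metadata = (("authorization", "Key " + os.environ["CLARIFAI_PAT"]),)

    with open("steinmetz_8270.jpg", "rb") as f:  # illustrative file name
        image_bytes = f.read()

    request = service_pb2.PostModelOutputsRequest(
        # Public general model; these IDs follow Clarifai's quickstart and are assumptions.
        user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
        model_id="general-image-recognition",
        inputs=[resources_pb2.Input(
            data=resources_pb2.Data(image=resources_pb2.Image(base64=image_bytes))
        )],
    )
    response = stub.PostModelOutputs(request, metadata=metadata)
    for concept in response.outputs[0].data.concepts:
        print(concept.name, round(concept.value * 100, 1))  # value is 0-1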

Imagga
created on 2022-01-08

man 32.9
people 29.5
person 27
shop 24.2
male 22.7
adult 22.3
indoors 21.1
chair 20.1
working 18.5
lifestyle 18.1
work 18
barbershop 16.8
job 15.9
room 15.2
worker 15.1
interior 15
seat 14.8
occupation 14.7
men 13.7
portrait 13.6
business 13.4
mercantile establishment 13.2
computer 13.2
sitting 12.9
equipment 12.7
women 12.6
barber chair 12.6
happy 12.5
smiling 12.3
hairdresser 11.9
machine 11.9
casual 11.9
professional 11.7
place of business 11.1
industry 11.1
laptop 11
table 10.9
device 10.8
salon 10.5
one 10.4
home 10.4
indoor 10
senior 9.4
office 9.4
smile 9.3
inside 9.2
fashion 9
two 8.5
mature 8.4
fun 8.2
outdoors 8.2
steel 8
businessman 7.9
couple 7.8
engine 7.7
repair 7.7
health 7.6
workshop 7.6
human 7.5
clothes 7.5
phone 7.4
back 7.3
teenager 7.3
industrial 7.3
building 7.2
furniture 7.2
patient 7.2
vehicle 7.1
hospital 7.1
travel 7
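
Imagga serves its tagger over a REST endpoint. A sketch using the requests library; the image URL and credential variable names are illustrative:

    import os
    import requests

    # Imagga v2 tags endpoint with HTTP basic auth (key/secret from your account).
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/steinmetz_8270.jpg"},  # illustrative URL
        auth=(os.environ["IMAGGA_KEY"], os.environ["IMAGGA_SECRET"]),
    )
    for entry in response.json()["result"]["tags"]:
        print(entry["tag"]["en"], entry["confidence"])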

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 97.5
text 89.8
black and white 83.1
clothing 81.3
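
Microsoft's tags correspond to the Azure Computer Vision tagging operation. A sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint/key variables and file name are placeholders:

    import os
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        os.environ["AZURE_CV_ENDPOINT"],  # placeholder environment variable
        CognitiveServicesCredentials(os.environ["AZURE_CV_KEY"]),
    )

    with open("steinmetz_8270.jpg", "rb") as f:  # illustrative file name
        result = client.tag_image_in_stream(f)

    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))  # confidence is 0-1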

Color Analysis

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 98.3%
Calm 97%
Sad 1.3%
Happy 1.2%
Disgusted 0.1%
Surprised 0.1%
Confused 0.1%
Fear 0.1%
Angry 0%

AWS Rekognition

Age 25-35
Gender Male, 83.1%
Calm 99.1%
Sad 0.4%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%
Angry 0%
Confused 0%
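
The face records above follow the shape of Rekognition's DetectFaces response: an age range, a gender estimate with confidence, and a ranked emotion list. A minimal boto3 sketch; the file name is illustrative:

    import boto3

    client = boto3.client("rekognition")
    with open("steinmetz_8270.jpg", "rb") as f:  # illustrative file name
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotions.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")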

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
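
Google Vision reports face attributes as coarse likelihood buckets rather than percentages, which is why separate faces can yield identical rows. A sketch with the google-cloud-vision client; the file name is illustrative and application-default credentials are assumed:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes application-default credentials
    with open("steinmetz_8270.jpg", "rb") as f:  # illustrative file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)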

Feature analysis

Amazon

Person 99%

Text analysis

Amazon

4
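
The single detected string ("4") matches the shape of Rekognition's DetectText output. A minimal boto3 sketch; the file name is illustrative:

    import boto3

    client = boto3.client("rekognition")
    with open("steinmetz_8270.jpg", "rb") as f:  # illustrative file name
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip word-level duplicates
            print(detection["DetectedText"], round(detection["Confidence"], 1))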