Human Generated Data

Title

Untitled (men and women standing near fireplace with drinks and food, Jos. Wharton Estate (Lippincott))

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5160

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.8
Human 99.8
Person 99.6
Person 99.4
Person 99.3
Person 98.7
Person 98.6
Person 97.2
Chair 96.3
Furniture 96.3
Clinic 96
Person 95.6
Person 94
Person 93.7
Person 93.3
Person 93
Person 87.8
Indoors 79.3
Doctor 79.2
People 79
Room 77.7
Hospital 76.8
Operating Theatre 62.8
Interior Design 58.5
Nurse 57
Person 45.7
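
These label/confidence pairs match the output shape of Amazon Rekognition's DetectLabels operation. A minimal sketch of a call that could produce such tags, assuming default AWS credentials and a local copy of the scan; the file name and MinConfidence cutoff are illustrative assumptions, not values recorded here:

    import boto3

    # Rekognition client; region and credentials come from the AWS environment.
    client = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=45,  # assumed cutoff; the lowest tag above is 45.7
        )

    # Each label carries a name and a confidence percentage, as listed above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")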

Clarifai
created on 2023-10-26

people 99.8
group 98.4
man 98.1
adult 97.7
many 97.1
woman 95.8
education 93
leader 91.3
group together 90.2
monochrome 88.4
war 87.8
administration 87.8
child 81.3
wear 80.8
school 80
sit 76
sitting 75.9
teacher 75.4
crowd 75
military 73.3
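
Clarifai's general image-recognition model returns concept/confidence pairs like those above. A sketch using the clarifai-grpc Python client; the personal access token placeholder, image URL, and public model coordinates are all assumptions:

    from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
    from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

    stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
    metadata = (("authorization", "Key YOUR_PAT"),)  # assumed access token

    request = service_pb2.PostModelOutputsRequest(
        # Assumed coordinates of Clarifai's public general model.
        user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
        model_id="general-image-recognition",
        inputs=[resources_pb2.Input(
            data=resources_pb2.Data(
                image=resources_pb2.Image(url="https://example.org/image.jpg")
            )
        )],
    )
    response = stub.PostModelOutputs(request, metadata=metadata)

    # Concept values are 0-1; scale to match the percentages listed above.
    for concept in response.outputs[0].data.concepts:
        print(f"{concept.name} {concept.value * 100:.1f}")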

Imagga
created on 2022-01-23

newspaper 43.4
product 34.4
people 34
creation 27.9
person 26.4
business 25.5
male 24.8
businessman 24.7
nurse 23.9
man 22.8
adult 20.9
team 19.7
daily 19.3
group 18.5
happy 18.2
men 18
women 17.4
work 16.6
couple 16.5
worker 16.1
cheerful 15.4
professional 15.2
businesswoman 14.5
home 14.3
businesspeople 14.2
smiling 13.7
office 13.6
colleagues 13.6
teamwork 13
portrait 12.9
medical 12.3
indoors 12.3
meeting 12.2
job 11.5
room 11.1
together 10.5
human 10.5
technology 10.4
doctor 10.3
friends 10.3
senior 10.3
corporate 10.3
manager 10.2
happiness 10.2
two 10.2
family 9.8
health 9.7
talking 9.5
sitting 9.4
day 9.4
friendship 9.4
mature 9.3
20s 9.2
old 9
hospital 8.9
associates 8.8
success 8.8
medicine 8.8
60s 8.8
cooperation 8.7
desk 8.5
successful 8.2
patient 8.2
new 8.1
computer 8
working 7.9
clinic 7.9
smile 7.8
casual clothing 7.8
negative 7.8
table 7.8
40s 7.8
lab 7.8
laboratory 7.7
finance 7.6
hand 7.6
enjoying 7.6
life 7.5
aged 7.2
slick 7.2
mother 7.2
modern 7
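
Imagga's tagging endpoint returns the same tag/confidence structure. A sketch against its v2 REST API, assuming an API key/secret pair and a publicly reachable image URL (all placeholders):

    import requests

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/image.jpg"},  # assumed URL
        auth=("API_KEY", "API_SECRET"),  # assumed credentials
    )

    # Tags come back with confidences on a 0-100 scale, as listed above.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")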

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.9
table 96.3
furniture 79.9
woman 76.3
clothing 73.7
chair 64.2
person 62.1
wedding dress 56.9
man 55.2
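
These Microsoft tags (and the caption under "Captions" further below) are consistent with Azure Computer Vision's Analyze Image operation. A sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # assumed endpoint
        CognitiveServicesCredentials("YOUR_KEY"),              # assumed key
    )

    analysis = client.analyze_image(
        "https://example.org/image.jpg",  # assumed image URL
        visual_features=[VisualFeatureTypes.tags],
    )

    # Tag confidences are 0-1; scale to match the percentages above.
    for tag in analysis.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")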

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 24-34
Gender Male, 72.7%
Calm 29.9%
Happy 28.8%
Sad 22.4%
Angry 6.1%
Confused 5.6%
Surprised 3.2%
Disgusted 2.5%
Fear 1.5%

AWS Rekognition

Age 25-35
Gender Female, 76.7%
Calm 57.5%
Sad 38.4%
Confused 1.2%
Surprised 0.8%
Fear 0.7%
Happy 0.6%
Angry 0.6%
Disgusted 0.2%

AWS Rekognition

Age 48-56
Gender Male, 99.8%
Calm 100%
Surprised 0%
Disgusted 0%
Confused 0%
Angry 0%
Happy 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 45-51
Gender Male, 98.5%
Sad 51.7%
Confused 27.3%
Calm 16.8%
Happy 1.7%
Disgusted 0.9%
Angry 0.7%
Surprised 0.5%
Fear 0.4%

AWS Rekognition

Age 42-50
Gender Female, 56.8%
Calm 93.5%
Sad 3.7%
Fear 0.9%
Confused 0.7%
Angry 0.5%
Happy 0.3%
Disgusted 0.2%
Surprised 0.2%

AWS Rekognition

Age 45-51
Gender Female, 85%
Calm 97%
Sad 2.1%
Happy 0.3%
Fear 0.2%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Surprised 0.1%

AWS Rekognition

Age 23-33
Gender Female, 61.4%
Calm 81.5%
Sad 14.3%
Happy 2%
Surprised 0.7%
Angry 0.7%
Disgusted 0.3%
Fear 0.3%
Confused 0.2%
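
Each block above is one entry from Amazon Rekognition's DetectFaces response: an estimated age range, a gender guess with its confidence, and a confidence-ranked emotion distribution. A sketch of the call, assuming a local image file (the name is a placeholder):

    import boto3

    client = boto3.client("rekognition")
    with open("image.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are reported as a distribution; sort to match the layout above.
        for emotion in sorted(face["Emotions"],
                              key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")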

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
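
The Google Vision blocks report per-face likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch with the google-cloud-vision client, assuming application-default credentials and a local image file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("image.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each face carries bucketed likelihoods, rendered above as e.g. "Very unlikely".
    for face in response.face_annotations:
        for label, value in [
            ("Surprise", face.surprise_likelihood),
            ("Anger", face.anger_likelihood),
            ("Sorrow", face.sorrow_likelihood),
            ("Joy", face.joy_likelihood),
            ("Headwear", face.headwear_likelihood),
            ("Blurred", face.blurred_likelihood),
        ]:
            print(label, vision.Likelihood(value).name)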

Feature analysis

Amazon

Person 99.8%
Chair 96.3%

Categories

Imagga

paintings art 99.9%
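
This single category looks like output from Imagga's categorization endpoint; "paintings art" is one of the categories of its personal_photos categorizer. A sketch reusing the same assumed credentials as the Imagga tagging example; the categorizer ID and response shape are assumptions:

    import requests

    response = requests.get(
        "https://api.imagga.com/v2/categories/personal_photos",  # assumed categorizer
        params={"image_url": "https://example.org/image.jpg"},
        auth=("API_KEY", "API_SECRET"),
    )

    # Assumed response shape: categories with localized names and confidences.
    for cat in response.json()["result"]["categories"]:
        print(f"{cat['name']['en']} {cat['confidence']:.1f}%")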

Captions

Microsoft
created on 2022-01-23

graphical user interface 35.2%
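
The caption score is consistent with Azure Computer Vision's Describe Image operation, which returns ranked caption candidates. A sketch, with the same assumed endpoint and key as the Microsoft tags example:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # assumed endpoint
        CognitiveServicesCredentials("YOUR_KEY"),              # assumed key
    )

    description = client.describe_image(
        "https://example.org/image.jpg",  # assumed image URL
        max_candidates=1,
    )
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")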

Text analysis

Amazon

13705

Google

1370 13705. HAC YT33A2-MAMTZA
1370
13705.
HAC
YT33A2-MAMTZA
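
These strings are OCR output. Amazon Rekognition's DetectText returns detected lines and words with confidences, and Google Vision's text_detection returns a full-text annotation followed by per-token entries, which is why the Google block lists the same string whole and then broken apart. A sketch of the Amazon side, assuming a local image file:

    import boto3

    client = boto3.client("rekognition")
    with open("image.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # LINE entries give whole detected strings; WORD entries give the tokens.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])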