Human Generated Data

Title

Untitled (men, women and dog sitting around Christmas tree)

Date

1958

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7178

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.5
Human 99.5
Person 99.5
Person 99.1
Chair 99.1
Furniture 99.1
Person 96.8
Shoe 95
Footwear 95
Clothing 95
Apparel 95
Shoe 94.5
Plant 87.6
Tree 86.3
Indoors 84.6
Interior Design 84.6
Crowd 81
Room 76.2
Sitting 74.9
People 72.7
Living Room 64
Audience 61
Suit 59.5
Coat 59.5
Overcoat 59.5
Couch 59.4
Text 58.2
Photography 57.3
Photo 57.3
Table 55.7
Person 53.8
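
The record does not say how these labels were produced, but name-plus-score pairs in this 0-100 range are what Amazon Rekognition's DetectLabels operation returns. A minimal sketch using boto3; the S3 bucket and object key are placeholders, not taken from this record:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Placeholder bucket/key: the actual image location is not given here.
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-7178.jpg"}},
        MaxLabels=50,
        MinConfidence=50,
    )

    for label in response["Labels"]:
        # Each label carries a Name and a 0-100 Confidence, matching the list above.
        print(f'{label["Name"]} {label["Confidence"]:.1f}')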

Clarifai
created on 2023-10-25

people 99.9
man 97.9
adult 97.8
woman 97.5
group 97.3
group together 97.2
room 96.8
monochrome 96.6
sit 96
furniture 95.1
chair 94.7
family 92.3
many 91.2
leader 89.4
indoors 89.1
home 87.2
several 85.1
three 84.4
administration 84.4
child 83.6
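
Clarifai returns concept scores as 0-1 probabilities from its predict endpoint. A hedged sketch against Clarifai's public general-image-recognition model over REST; the access token, image URL, and exact model path are assumptions rather than details from this record:

    import requests

    PAT = "YOUR_CLARIFAI_PAT"  # placeholder credential
    IMAGE_URL = "https://example.org/steinmetz-7178.jpg"  # placeholder image location

    response = requests.post(
        "https://api.clarifai.com/v2/users/clarifai/apps/main/models/general-image-recognition/outputs",
        headers={"Authorization": f"Key {PAT}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    response.raise_for_status()

    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        # Concept values are 0-1; scale by 100 to match the percentages listed above.
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')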

Imagga
created on 2022-01-08

classroom 43.4
room 43.1
man 36.3
male 36.2
people 31.8
business 31
businessman 30.9
person 27.9
office 26.9
group 25
adult 24.7
blackboard 24.6
women 22.9
teacher 22.1
men 21.5
meeting 19.8
professional 17.5
sitting 17.2
happy 16.9
modern 16.8
couple 16.5
corporate 16.3
team 16.1
smiling 15.9
laptop 15.6
job 15
manager 14.9
lifestyle 14.4
communication 14.3
interior 14.1
together 14
teamwork 13.9
table 13.8
businesswoman 13.6
desk 13.2
education 13
portrait 12.9
computer 12.8
indoor 12.8
work 12.7
school 12.7
building 12.3
indoors 12.3
success 12.1
two 11.9
chair 11.5
talking 11.4
smile 11.4
cheerful 11.4
mature 11.2
casual 11
suit 10.8
working 10.6
hall 10.3
executive 10.3
educator 10.2
black 10.2
newspaper 9.9
human 9.7
discussion 9.7
colleagues 9.7
urban 9.6
businesspeople 9.5
glass 9.3
finance 9.3
silhouette 9.1
board 9
handsome 8.9
looking 8.8
conference 8.8
teaching 8.8
boy 8.7
mid adult 8.7
30s 8.7
day 8.6
boss 8.6
world 8.5
student 8.5
friendship 8.4
city 8.3
life 8.3
holding 8.3
20s 8.2
employee 8.1
family 8
window 7.9
architecture 7.8
color 7.8
product 7.7
diversity 7.7
confidence 7.7
career 7.6
friends 7.5
technology 7.4
worker 7.4
phone 7.4
inside 7.4
percussion instrument 7.3
occupation 7.3
child 7.3
confident 7.3
copy space 7.2
idea 7.1
happiness 7
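
Imagga's tagging endpoint reports the same kind of 0-100 confidence per tag. A minimal sketch of its v2 /tags call, assuming HTTP basic auth with placeholder credentials and a placeholder image URL:

    import requests

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/steinmetz-7178.jpg"},  # placeholder
        auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),  # placeholder credentials
    )
    response.raise_for_status()

    for entry in response.json()["result"]["tags"]:
        # Each entry pairs a 0-100 confidence with a tag name keyed by language code.
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')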

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 97.7
christmas tree 91.6
window 87.2
clothing 80.4
person 75
furniture 73.8
man 68.6
table 58.2
group 55.9
chair 55.1
house 53.7
old 53.2
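
Tag-and-confidence pairs like the Microsoft list above resemble the output of the Azure Computer Vision analyze operation. A minimal sketch against the v3.2 REST endpoint; the resource endpoint, subscription key, and image URL are placeholders:

    import requests

    ENDPOINT = "https://example-resource.cognitiveservices.azure.com"  # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},  # placeholder
        json={"url": "https://example.org/steinmetz-7178.jpg"},  # placeholder
    )
    response.raise_for_status()

    for tag in response.json()["tags"]:
        # Azure reports 0-1 confidences; scale by 100 to match the percentages above.
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')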

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 95.5%
Happy 84.8%
Sad 6.7%
Calm 2.9%
Confused 2.4%
Surprised 1.1%
Disgusted 0.8%
Fear 0.8%
Angry 0.6%

AWS Rekognition

Age 49-57
Gender Female, 71%
Happy 85.5%
Sad 6.2%
Calm 3.2%
Angry 2%
Confused 0.9%
Disgusted 0.8%
Surprised 0.7%
Fear 0.7%

AWS Rekognition

Age 41-49
Gender Male, 70.8%
Sad 48.8%
Calm 46.7%
Confused 2.4%
Happy 1.3%
Angry 0.4%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 31-41
Gender Female, 97%
Happy 65.5%
Calm 13.3%
Sad 8.5%
Surprised 4.4%
Fear 2.9%
Confused 2.4%
Angry 1.7%
Disgusted 1.2%

AWS Rekognition

Age 40-48
Gender Female, 98.8%
Happy 97.5%
Sad 1.2%
Surprised 0.3%
Calm 0.3%
Fear 0.2%
Angry 0.2%
Confused 0.1%
Disgusted 0.1%
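
Age ranges, gender estimates, and per-emotion percentages like the blocks above are what Rekognition's DetectFaces returns when all face attributes are requested. A minimal boto3 sketch, again with a placeholder bucket and key:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-7178.jpg"}},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            # Emotion types arrive uppercase (e.g. HAPPY); title-case to match the list above.
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')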

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
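
The "Very unlikely"/"Unlikely" labels above correspond to the likelihood enums returned by Google Cloud Vision face detection. A minimal sketch with the google-cloud-vision client and a placeholder local file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz-7178.jpg", "rb") as f:  # placeholder filename
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each likelihood is an enum: VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)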

Feature analysis

Amazon

Person 99.5%
Shoe 95%

Categories

Imagga

paintings art 99%

Text analysis

Amazon

43758.

Google

43758.
43758.
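
The repeated "43758." under Google is typical of OCR output that reports both a full-text annotation and per-word matches. For the Amazon result, a minimal Rekognition DetectText sketch with a placeholder bucket and key:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-7178.jpg"}}
    )

    for detection in response["TextDetections"]:
        # LINE entries give whole lines; WORD entries repeat the same text word by word.
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')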