Human Generated Data

Title

Untitled (group portrait of adults and children in living room)

Date

1939

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4996

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 98.7
Human 98.7
Person 98.3
Person 97.3
Person 96.3
Person 94.9
Person 93
Person 92.9
Person 92.7
Person 91.5
Person 87.7
Clinic 86.4
Person 85.8
Person 82.8
People 73.6
Person 71.6
Person 68.8
Room 62.5
Indoors 62.5
Person 62.3
Table Lamp 58.2
Lamp 58.2
Hospital 55.3

Clarifai
created on 2023-10-26

people 99.7
group 98
man 97.1
woman 96.3
adult 96.1
child 96.1
education 96.1
many 94.4
indoors 93.7
sit 92.5
leader 87.7
monochrome 87.5
chair 86.2
school 85.9
audience 82.2
room 80.4
teacher 80.3
family 79.8
crowd 78.4
war 77

Imagga
created on 2022-01-22

drawing 27.7
business 24.9
design 19.7
sketch 18.8
art 16.8
symbol 15.5
set 15.3
silhouette 14.9
element 14.9
graphic 14.6
technology 14.1
icon 13.5
modern 13.3
pattern 13
representation 12.4
sign 12
black 12
construction 12
office 11.2
people 11.2
house 10.9
equipment 10.8
man 10.8
backgrounds 10.5
clip art 10.2
team 9.9
creative 9.7
industry 9.4
architecture 9.4
web 9.3
icons 9.3
wagon 9.2
plaything 9.1
person 8.9
boutique 8.9
button 8.8
decoration 8.7
gymnasium 8.6
line 8.6
menu 8.6
home 8.5
3d 8.5
grunge 8.5
bank 8.4
reflection 8.4
paper 8.3
style 8.2
facility 8.1
computer 8
interior 8
businessman 7.9
room 7.6
plan 7.6
communication 7.6
arrow 7.6
clean 7.5
city 7.5
retro 7.4
template 7.3
toy 7.3
graphics 7.3
creation 7.3
group 7.3
wheeled vehicle 7.2
idea 7.1
frame 7.1
structure 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.1
window 84.2
table 82.5
clothing 75.6
person 65.2
wedding dress 52
woman 51.2
old 50.3
posing 42.5
room 41.3

Face analysis

AWS Rekognition

Age 39-47
Gender Female, 91.2%
Sad 56.5%
Happy 16.7%
Calm 13.9%
Confused 9.2%
Disgusted 1%
Surprised 1%
Angry 0.9%
Fear 0.8%

AWS Rekognition

Age 36-44
Gender Female, 50.9%
Happy 62%
Calm 27.2%
Sad 4.4%
Confused 3.4%
Disgusted 1.2%
Surprised 0.8%
Angry 0.5%
Fear 0.5%

AWS Rekognition

Age 40-48
Gender Female, 95.4%
Happy 43.4%
Sad 22.8%
Calm 18%
Confused 9.3%
Fear 2.2%
Angry 1.7%
Surprised 1.7%
Disgusted 0.9%

AWS Rekognition

Age 38-46
Gender Female, 85.3%
Calm 70.4%
Happy 19.5%
Sad 6.7%
Confused 1.1%
Surprised 0.7%
Angry 0.7%
Fear 0.6%
Disgusted 0.3%

AWS Rekognition

Age 40-48
Gender Male, 70.6%
Calm 35.1%
Happy 31.7%
Sad 12.9%
Fear 8.9%
Angry 3.7%
Surprised 2.8%
Disgusted 2.6%
Confused 2.3%

AWS Rekognition

Age 40-48
Gender Female, 78.8%
Happy 71%
Sad 18%
Calm 4.1%
Confused 2.1%
Fear 1.6%
Angry 1.5%
Surprised 0.9%
Disgusted 0.9%

AWS Rekognition

Age 26-36
Gender Male, 81.7%
Calm 43.9%
Sad 33.7%
Happy 17.1%
Confused 3.9%
Angry 0.5%
Surprised 0.4%
Disgusted 0.3%
Fear 0.3%

AWS Rekognition

Age 43-51
Gender Male, 88%
Sad 62.8%
Confused 22.8%
Happy 6%
Calm 4.8%
Disgusted 1.5%
Fear 1.2%
Angry 0.5%
Surprised 0.4%

AWS Rekognition

Age 23-33
Gender Male, 97.9%
Sad 91.8%
Calm 3%
Happy 1.6%
Confused 1.2%
Angry 0.8%
Fear 0.7%
Surprised 0.4%
Disgusted 0.4%

AWS Rekognition

Age 49-57
Gender Female, 63.5%
Sad 64.4%
Calm 30%
Happy 2.2%
Confused 1.3%
Disgusted 0.8%
Fear 0.6%
Angry 0.5%
Surprised 0.3%

AWS Rekognition

Age 29-39
Gender Female, 71%
Sad 55.4%
Calm 32%
Confused 4.2%
Happy 4%
Disgusted 1.7%
Fear 1.2%
Angry 1%
Surprised 0.5%

AWS Rekognition

Age 27-37
Gender Female, 58.9%
Calm 57.5%
Sad 18.8%
Confused 8%
Happy 7.5%
Fear 3%
Angry 2.1%
Disgusted 2.1%
Surprised 1.1%

AWS Rekognition

Age 37-45
Gender Female, 99.2%
Sad 76.3%
Happy 19.6%
Calm 1.5%
Confused 1.1%
Disgusted 0.5%
Surprised 0.4%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 36-44
Gender Female, 82%
Happy 72.4%
Sad 13.1%
Calm 9.8%
Fear 1.4%
Surprised 1.1%
Angry 0.9%
Confused 0.9%
Disgusted 0.5%

AWS Rekognition

Age 33-41
Gender Female, 75.6%
Calm 98.6%
Sad 0.9%
Confused 0.2%
Happy 0.1%
Fear 0.1%
Disgusted 0.1%
Angry 0%
Surprised 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%

Text analysis

Amazon

11427.
ar
HAMT2AR

Google

11427. I427.
11427.
I427.