Human Generated Data

Title

Untitled (family portrait in living room)

Date

1935

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21918

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 96.4
Human 96.4
Person 93.2
Person 92.2
Person 92
Person 90.4
Person 88.9
Person 88.7
Interior Design 83.1
Indoors 83.1
Room 82.8
Leisure Activities 77
People 72.4
Person 67.7
Person 67.5
Art 60.6
Toy 58.9
Window Display 58.2
Shop 58.2
Paper 56.3
Living Room 55.5
Person 45.1
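
The Amazon tags above are the kind of per-label confidence scores returned by Amazon Rekognition's DetectLabels operation. The sketch below shows how such a list could be produced with boto3; the local file name and the MaxLabels/MinConfidence settings are illustrative assumptions, not values recorded with this object.

```python
# Minimal sketch: label detection with Amazon Rekognition (boto3).
# "family-portrait.jpg" and the thresholds below are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("family-portrait.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=40,
    )

# Print each label as "Name Confidence", matching the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```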

Clarifai
created on 2023-10-22

people 99.6
group 98.9
adult 95.3
wear 94.8
many 94
man 93.4
woman 90.5
child 89.6
retro 88.9
desktop 88.1
art 87
military 85.7
vintage 85.5
dancing 85.2
illustration 83.9
group together 83.8
one 82.1
music 81.6
old 81.3
several 81.1
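
The Clarifai concepts above are percentage confidences of the sort returned by Clarifai's prediction API. A hedged sketch against the public v2 REST endpoint follows; the API key, model ID, and image URL are placeholders, not values taken from this record.

```python
# Hedged sketch: concept tagging via the Clarifai v2 REST API using requests.
# The key, model ID, and image URL are placeholders/assumptions.
import requests

CLARIFAI_API_KEY = "YOUR_CLARIFAI_API_KEY"              # assumption
MODEL_ID = "general-image-recognition"                  # assumption: Clarifai's general model
IMAGE_URL = "https://example.org/family-portrait.jpg"   # assumption

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

# Each concept carries a name and a 0-1 score; scale to a percentage.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```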

Imagga
created on 2022-03-11

grunge 33.2
old 25.8
vintage 24.8
decoration 20.3
wall 19.5
antique 19.2
aged 19
retro 18.8
texture 18.7
room 18.3
art 17.7
black 17.4
graffito 17.4
dirty 16.3
silhouette 14.9
people 14.5
toilet 14.5
dark 14.2
ancient 13.8
man 13.5
person 13.1
grungy 12.3
pattern 12.3
weathered 11.4
textured 11.4
design 11.2
style 11.1
city 10.8
graphic 10.2
material 9.8
paper 9.6
male 9.2
business 9.1
structure 9
surface 8.8
urban 8.7
light 8.7
decay 8.7
stain 8.6
concrete 8.6
architecture 8.6
kin 8.5
frame 8.3
backdrop 8.2
effect 8.2
paint 8.1
building 8.1
brown 8.1
success 8
businessman 7.9
holiday 7.9
artistic 7.8
space 7.8
men 7.7
rusty 7.6
human 7.5
backgrounds 7.3
rough 7.3
group 7.2
painting 7.2
women 7.1
world 7
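
The Imagga tags above come from an automatic tagging service; a hedged sketch against Imagga's v2 tagging endpoint is shown below. The API credentials and image URL are placeholders, not values from this record.

```python
# Hedged sketch: auto-tagging with the Imagga v2 REST API using requests.
# The API key/secret and image URL are placeholders/assumptions.
import requests

IMAGGA_KEY = "YOUR_IMAGGA_API_KEY"                      # assumption
IMAGGA_SECRET = "YOUR_IMAGGA_API_SECRET"                # assumption
IMAGE_URL = "https://example.org/family-portrait.jpg"   # assumption

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

# Imagga returns an English tag plus a confidence score per entry.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```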

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

drawing 96.3
text 93.5
cartoon 81.3
person 80.7
black and white 73.3
clothing 61.8
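
The Microsoft tags above resemble the output of the Azure Computer Vision "Analyze Image" REST API. A hedged sketch follows; the endpoint, subscription key, and image URL are placeholders, and the v3.2 API version is an assumption.

```python
# Hedged sketch: image tagging with the Azure Computer Vision Analyze Image API (v3.2).
# The endpoint, key, and image URL are placeholders/assumptions.
import requests

AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # assumption
AZURE_KEY = "YOUR_AZURE_KEY"                                          # assumption
IMAGE_URL = "https://example.org/family-portrait.jpg"                 # assumption

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

# Confidence is 0-1; scale to a percentage to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```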

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 40-48
Gender Male, 98%
Calm 99.3%
Confused 0.5%
Happy 0.1%
Surprised 0.1%
Angry 0%
Disgusted 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 39-47
Gender Male, 90.7%
Calm 53.3%
Happy 44.9%
Sad 1%
Surprised 0.4%
Confused 0.2%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 30-40
Gender Female, 84.3%
Calm 66.4%
Confused 14.2%
Happy 10.5%
Sad 3.4%
Angry 3%
Disgusted 1.8%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 38-46
Gender Female, 53.4%
Calm 99.2%
Happy 0.3%
Angry 0.2%
Surprised 0.1%
Sad 0.1%
Disgusted 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 34-42
Gender Male, 96.2%
Calm 97.4%
Happy 2.1%
Sad 0.2%
Confused 0.1%
Disgusted 0%
Surprised 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 29-39
Gender Male, 99.9%
Calm 99.7%
Sad 0.1%
Surprised 0.1%
Confused 0%
Happy 0%
Disgusted 0%
Angry 0%
Fear 0%
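
The age ranges, gender estimates, and emotion percentages in the AWS Rekognition blocks above are the face attributes returned by the DetectFaces operation when all attributes are requested. A minimal boto3 sketch is shown below; the local file name is an illustrative assumption.

```python
# Minimal sketch: face analysis with Amazon Rekognition DetectFaces (boto3).
# Requesting Attributes=["ALL"] returns age range, gender, and emotions per face.
# "family-portrait.jpg" is an illustrative assumption.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("family-portrait.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are listed with per-emotion confidences, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```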

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
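
The Google Vision entries above report per-face likelihood ratings (Very unlikely through Very likely) rather than percentages. A sketch with the google-cloud-vision client library follows; the file path is an illustrative assumption and credentials are assumed to come from the environment.

```python
# Sketch: face detection with the Google Cloud Vision client library, which
# reports joy/sorrow/anger/surprise/headwear/blur as likelihood enums.
# "family-portrait.jpg" is an illustrative assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("family-portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood values are enum members such as VERY_UNLIKELY, UNLIKELY, POSSIBLE.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```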

Feature analysis

Amazon

Person
Person 96.4%
Person 93.2%
Person 92.2%
Person 92%
Person 90.4%
Person 88.9%
Person 88.7%
Person 67.7%
Person 67.5%
Person 45.1%
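
The per-person percentages in the feature analysis correspond to individual instances of the "Person" label, each with its own bounding box and confidence, in an Amazon Rekognition DetectLabels response. A minimal boto3 sketch follows; the file name and threshold are illustrative assumptions.

```python
# Minimal sketch: listing per-instance "Person" detections from DetectLabels (boto3).
# "family-portrait.jpg" and MinConfidence are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("family-portrait.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=40)

for label in response["Labels"]:
    if label["Name"] == "Person":
        for instance in label["Instances"]:
            box = instance["BoundingBox"]
            print(f"Person {instance['Confidence']:.1f}% "
                  f"(left={box['Left']:.2f}, top={box['Top']:.2f}, "
                  f"width={box['Width']:.2f}, height={box['Height']:.2f})")
```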

Text analysis

Amazon

ARDA
YY33A2
MJ17 YY33A2 ARDA
MJ17
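
The detected strings above, both as individual words and as a combined line, are the kind of result returned by Amazon Rekognition's DetectText operation. A minimal boto3 sketch is shown below; the local file name is an illustrative assumption.

```python
# Minimal sketch: text detection with Amazon Rekognition DetectText (boto3).
# Results come back as LINE and WORD detections with confidences.
# "family-portrait.jpg" is an illustrative assumption.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("family-portrait.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}%")
```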