Human Generated Data

Title

Untitled (middle-aged woman in striped dress seated on coffee table)

Date

1961

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11102

Machine Generated Data

Tags

Amazon
created on 2022-03-19

Person 99.7
Human 99.7
Plant 98.3
Couch 97.2
Furniture 97.2
Person 95.9
Flower Bouquet 94.7
Flower 94.7
Flower Arrangement 94.7
Blossom 94.7
Sitting 84.5
Person 80.9
Flooring 75.8
Living Room 75.7
Indoors 75.7
Room 75.7
Cooktop 69.7
Wood 61.5
Hardwood 59.6
Interior Design 57.5
Shelf 55.5
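
The Amazon tags above are label-detection output from AWS Rekognition, listed with confidence scores in percent. A minimal sketch of how such tags could be regenerated with boto3, assuming AWS credentials are configured and the image is available locally ("photo.jpg" is a hypothetical file name):

```python
# Minimal sketch: Rekognition label detection for a single scanned photograph.
# Assumes configured AWS credentials; "photo.jpg" is a hypothetical file name.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # cap the number of returned labels
    MinConfidence=55.0,  # drop low-confidence labels, matching the ~55 cutoff above
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```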

Clarifai
created on 2025-01-10

people 99.9
one 98.9
monochrome 98.4
portrait 98.2
woman 98
adult 96.6
furniture 93.6
wear 92.9
seat 91.4
analogue 91.3
art 91.1
actress 90.8
writer 90.5
wedding 89.1
sit 89.1
two 88.1
music 88
chair 87.5
man 86.5
girl 85.5
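
The Clarifai concepts above could be reproduced through Clarifai's v2 REST API. The sketch below is an assumption-laden illustration: the personal access token is a placeholder, and the "general-image-recognition" model id is taken from Clarifai's public documentation.

```python
# Minimal sketch: Clarifai general-model tagging over the v2 REST API.
# The access token is a placeholder; the model id is assumed from public docs.
import base64
import requests

with open("photo.jpg", "rb") as f:  # hypothetical file name
    encoded = base64.b64encode(f.read()).decode()

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key <your-personal-access-token>"},
    json={"inputs": [{"data": {"image": {"base64": encoded}}}]},
)

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```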

Imagga
created on 2022-03-19

home 35.9
people 29
interior 25.6
furniture 25.2
person 25
room 23.7
happy 22.6
happiness 20.4
smiling 20.2
kitchen 20.1
domestic 18.2
bakery 18
indoors 17.6
house 17.5
shop 17.2
adult 16.8
pretty 16.1
family 16
table 15.6
couple 14.8
lifestyle 14.5
women 14.2
food 13.8
portrait 13.6
fashion 13.6
man 13.4
bride 13.4
sitting 12.9
love 12.6
mercantile establishment 12.2
cheerful 12.2
smile 12.1
wedding 12
male 11.7
furnishing 11.5
bouquet 11.5
attractive 11.2
style 11.1
mother 11
life 10.9
meal 10.9
luxury 10.3
decoration 10.3
elegance 10.1
dress 9.9
vintage 9.9
retro 9.8
sexy 9.6
cooking 9.6
standing 9.6
child 9.5
wife 9.5
two 9.3
lady 8.9
work 8.8
celebration 8.8
together 8.8
black 8.7
hair 8.7
ancient 8.6
party 8.6
window 8.6
lunch 8.6
home appliance 8.5
drink 8.4
coffee 8.3
appliance 8.3
case 8.2
blond 8.2
indoor 8.2
one 8.2
stylish 8.1
place of business 8.1
washstand 8
looking 8
cute 7.9
holiday 7.9
stove 7.9
brunette 7.8
boy 7.8
dinner 7.8
antique 7.8
sofa 7.8
gift 7.7
modern 7.7
casual 7.6
eating 7.6
groom 7.5
chair 7.5
fun 7.5
inside 7.4
teen 7.3
romantic 7.1
posing 7.1
look 7
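
The Imagga tags could be fetched from Imagga's v2 tagging endpoint, which authenticates with HTTP Basic credentials. A sketch, with placeholder key, secret, and image URL:

```python
# Minimal sketch: Imagga v2 tagging endpoint; all credentials are placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder URL
    auth=("<api-key>", "<api-secret>"),
)

# Each entry pairs a confidence score with a language-keyed tag name.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```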

Microsoft
created on 2022-03-19

vase 92.3
furniture 88.3
flower 83.2
table 80.8
black and white 80
person 73.1
clothing 71
woman 57.2
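
The Microsoft tags come from the Azure Computer Vision service. A sketch using the azure-cognitiveservices-vision-computervision package, with placeholder endpoint and key:

```python
# Minimal sketch: Azure Computer Vision image tagging.
# The endpoint, key, and file name are placeholders, not real values.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

with open("photo.jpg", "rb") as f:  # hypothetical file name
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```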

Face analysis

AWS Rekognition

Age 52-60
Gender Female, 100%
Calm 56.8%
Happy 30.5%
Confused 5%
Surprised 2.6%
Fear 1.5%
Angry 1.3%
Disgusted 1.2%
Sad 1.1%

AWS Rekognition

Age 35-43
Gender Female, 72.8%
Calm 60.8%
Confused 34.5%
Sad 2.2%
Happy 0.6%
Angry 0.5%
Surprised 0.5%
Fear 0.5%
Disgusted 0.4%

AWS Rekognition

Age 22-30
Gender Male, 92%
Happy 47.6%
Disgusted 30.1%
Sad 4.9%
Calm 4.6%
Confused 4.5%
Surprised 2.8%
Fear 2.7%
Angry 2.7%

AWS Rekognition

Age 14-22
Gender Male, 66%
Calm 82.5%
Confused 5.5%
Angry 2.8%
Sad 2.7%
Surprised 2.2%
Happy 1.9%
Disgusted 1.5%
Fear 0.9%
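
Each AWS Rekognition block above (age range, gender call, and emotion scores, one block per detected face) maps directly onto fields of the DetectFaces response. A sketch under the same assumptions as the label example:

```python
# Minimal sketch: Rekognition face analysis; "photo.jpg" is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotions, not just boxes
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unordered; sort high-to-low to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```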

Microsoft Cognitive Services

Age 55
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
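
The Google Vision results report likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch with the google-cloud-vision client library, assuming configured Google Cloud credentials:

```python
# Minimal sketch: Google Vision face likelihoods; "photo.jpg" is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, VERY_UNLIKELY through VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```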

Feature analysis

Amazon

Person 99.7%
Person 95.9%
Person 80.9%
Cooktop 69.7%

Captions

Microsoft
created on 2022-03-19

a person sitting on a table 69.9%
a person sitting on a table 63.1%
a girl sitting on a table 50.6%
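
The caption candidates come from Azure Computer Vision's describe operation, which returns several sentences ranked by confidence. A sketch reusing the placeholder endpoint/key pattern from the tagging example above:

```python
# Minimal sketch: Azure Computer Vision caption candidates.
# The endpoint, key, and file name are placeholders, not real values.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

with open("photo.jpg", "rb") as f:  # hypothetical file name
    analysis = client.describe_image_in_stream(f, max_candidates=3)

for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```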

Text analysis

Amazon

-NAOOX
However
سدامة
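
The strings above are raw OCR detections from the photograph itself, including partial and garbled fragments, which is typical of Rekognition's DetectText output on a scanned print. A sketch:

```python
# Minimal sketch: Rekognition text detection; "photo.jpg" is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # print line-level hits, skip word duplicates
        print(detection["DetectedText"])
```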