Human Generated Data

Title

Untitled (portrait of five women, two seated)

Date

c. 1950

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12242

Machine Generated Data

Tags

Amazon
created on 2023-10-25

People 99.9
Clothing 99.7
Dress 99.7
Face 99.6
Head 99.6
Photography 99.6
Portrait 99.6
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 99.1
Female 99.1
Woman 99.1
Person 99.1
Person 99
Person 99
Adult 99
Male 99
Man 99
Formal Wear 95.4
Footwear 89
Shoe 89
Shoe 81.8
Shoe 75
Shoe 66.4
Furniture 64.5
Shoe 61.2
Blouse 58
Skirt 57.8
Suit 57.1
Chair 57.1
Couch 56.1
Arch 55.7
Architecture 55.7
Art 55.6
Painting 55.6
Lady 55.5
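
The labels above follow the output shape of Amazon Rekognition's DetectLabels operation. Below is a minimal sketch of how such tags could be generated with boto3; the S3 bucket, object key, and region are placeholders, not the museum's actual storage:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.12242.jpg"}},
        MinConfidence=55,  # the lowest score shown above is 55.5
    )

    for label in response["Labels"]:
        # prints e.g. "People 99.9", matching the list format above
        print(f'{label["Name"]} {label["Confidence"]:.1f}')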

Clarifai
created on 2018-08-23

people 99.9
group 99.6
woman 98.6
group together 98
adult 96.7
man 95.9
leader 94.7
administration 94.2
child 94
several 93.6
many 93.2
family 93.1
wedding 92.7
five 92.3
four 92.1
music 89.8
home 89.7
portrait 89.4
actress 88
wear 86.5
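
The Clarifai concepts above are the kind of result returned by Clarifai's v2 prediction endpoint with its general model. A minimal sketch using plain HTTP; the API key and image URL are placeholders, and the model identifier is an assumption based on Clarifai's public general model:

    import requests

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
    )
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Clarifai scores concepts 0-1; scale to match the list above
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')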

Imagga
created on 2018-08-23

kin 60.2
people 22.3
old 21.6
architecture 20.4
city 19.9
building 19.8
world 16.6
history 16.1
tourism 14.8
arch 14.6
street 13.8
historic 13.7
religion 13.4
man 13.4
travel 13.4
window 12.9
person 12.8
historical 12.2
ancient 12.1
church 12
adult 11.6
male 11.5
inside 11
palace 11
stone 11
business 10.9
entrance 10.6
scene 10.4
culture 10.2
famous 10.2
dark 10
landmark 9.9
black 9.8
room 9.6
door 9.6
tourist 9.3
exterior 9.2
art 9.1
group 8.9
urban 8.7
cathedral 8.6
men 8.6
monument 8.4
vintage 8.3
family 8
home 8
sidewalk 7.9
women 7.9
style 7.4
child 7.1
interior 7.1
businessman 7.1
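
The Imagga tags above match the response of Imagga's v2 /tags endpoint, which authenticates with an API key/secret pair over HTTP Basic auth. A minimal sketch; the credentials and image URL are placeholders:

    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),
    )
    resp.raise_for_status()

    for entry in resp.json()["result"]["tags"]:
        # each entry pairs a confidence with a tag name, e.g. "kin 60.2"
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')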

Google
created on 2018-08-23

Microsoft
created on 2018-08-23

person 99.7
outdoor 89.1
posing 88.3
people 75
group 70.1
old 46.3
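
The Microsoft tags above correspond to Azure Computer Vision's image-tagging operation. A minimal sketch with the Azure Python SDK; the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),
    )

    result = client.tag_image("https://example.org/photo.jpg")
    for tag in result.tags:
        # confidence is 0-1; scale to match the percentages above
        print(f"{tag.name} {tag.confidence * 100:.1f}")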

Face analysis

AWS Rekognition

Age 48-54
Gender Female, 100%
Calm 99.4%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0.1%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 36-44
Gender Female, 100%
Happy 87.6%
Surprised 7.2%
Fear 6.2%
Disgusted 4.1%
Sad 2.7%
Calm 1.9%
Angry 1.6%
Confused 0.8%

AWS Rekognition

Age 28-38
Gender Female, 100%
Happy 91.2%
Surprised 6.8%
Fear 6%
Calm 5.5%
Sad 2.3%
Confused 0.7%
Angry 0.4%
Disgusted 0.3%

AWS Rekognition

Age 48-54
Gender Female, 100%
Calm 98.5%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.5%
Disgusted 0.3%
Happy 0.2%
Angry 0.1%

AWS Rekognition

Age 31-41
Gender Female, 99.8%
Happy 42.9%
Calm 29.4%
Disgusted 20.4%
Surprised 7.7%
Fear 6.2%
Sad 2.4%
Angry 1.9%
Confused 1.1%
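
The five blocks above are per-face results in the shape of Amazon Rekognition's DetectFaces operation with full attributes: an estimated age range, a gender call with confidence, and a confidence per emotion. A minimal sketch with boto3; the bucket and key are placeholders:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.12242.jpg"}},
        Attributes=["ALL"],  # required to get age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            # emotion types come back uppercase, e.g. "CALM"
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')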

Microsoft Cognitive Services

Age 50
Gender Female

Microsoft Cognitive Services

Age 58
Gender Female

Microsoft Cognitive Services

Age 39
Gender Female
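
The single-number age estimates above match what Microsoft's Face API returned when this record was generated in 2018; the age and gender attributes have since been retired from the service, so the sketch below is historical. The endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.face import FaceClient
    from msrest.authentication import CognitiveServicesCredentials

    face_client = FaceClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),
    )

    faces = face_client.face.detect_with_url(
        "https://example.org/photo.jpg",
        return_face_attributes=["age", "gender"],
    )
    for face in faces:
        print(f"Age {face.face_attributes.age:.0f}")
        print("Gender", face.face_attributes.gender)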

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
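
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why one face above reads "Joy Possible". A minimal sketch with the google-cloud-vision client; the local file path is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)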

Feature analysis

Amazon

Person 99.1%
Adult 99.1%
Male 99.1%
Man 99.1%
Female 99.1%
Woman 99.1%
Shoe 89%

Text analysis

Amazon

ARDA
MJIR
MJIR STARTIN ARDA
STARTIN
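
The fragments above are the kind of output Amazon Rekognition's DetectText operation returns for lettering inside a photograph; it reports both whole lines and individual words, which is why "MJIR STARTIN ARDA" and its component words each appear. A minimal sketch with boto3; the bucket and key are placeholders:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.12242.jpg"}}
    )

    for detection in response["TextDetections"]:
        # Type is "LINE" or "WORD"
        print(detection["Type"], detection["DetectedText"])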