Human Generated Data

Title

Untitled (old woman and middle-aged woman seated in couch wearing dresses and corsages)

Date

1940

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9013

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-01-23

Clothing 98.7
Apparel 98.7
Human 97.6
Person 97.4
Plant 95.3
Person 94.4
Flower 90.1
Blossom 90.1
Flower Bouquet 88.7
Flower Arrangement 88.7
Female 86.4
Robe 84.8
Fashion 84.8
Gown 84.6
Woman 73.1
Wedding 72
Face 70
Wedding Gown 67.2
Portrait 65.5
Photography 65.5
Photo 65.5
Art 65.3
Evening Dress 63.1
Monitor 62.8
Electronics 62.8
Screen 62.8
Display 62.8
LCD Screen 62.7
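
The label/confidence pairs above are the typical output of an image-labeling service such as Amazon Rekognition's label detection. A minimal Python sketch of how comparable tags could be produced with boto3, assuming configured AWS credentials and a local copy of the photograph (the file name is hypothetical):

    import boto3

    # Assumes AWS credentials are already configured in the environment.
    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # MinConfidence drops low-scoring labels, roughly matching the
    # cutoff visible in the list above (lowest score ~62.7).
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=60,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")

Rekognition reports confidences on a 0-100 scale, which matches the scores listed above.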

Clarifai
created on 2023-10-26

people 99.8
group 97.1
wedding 96.9
monochrome 96.8
adult 95.7
woman 95.1
man 93.5
sit 93.3
furniture 90.7
leader 90.6
veil 90.6
portrait 88.2
seat 87.3
bride 86.9
singer 86.8
two 86.7
music 85.6
groom 84.6
chair 83.7
three 83.2

Imagga
created on 2022-01-23

black 18.1
art 15.6
design 15.2
person 15
man 14.8
graphic 14.6
people 12.8
human 12.7
light 12.7
dark 12.5
monitor 11.8
artistic 11.3
body 11.2
grunge 11.1
fantasy 10.8
silhouette 10.7
male 10.6
digital 10.5
motion 10.3
blackboard 10
drawing 10
negative 9.9
technology 9.6
pattern 9.6
old 9
texture 9
style 8.9
adult 8.8
symbol 8.7
render 8.6
line 8.6
film 8.4
night 8
work 7.8
space 7.7
elegance 7.5
background 7.5
decoration 7.5
backdrop 7.4
shape 7.3
dress 7.2
television 7.2
science 7.1
stage 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.7
drawing 77.9
sketch 55

Color Analysis

Face analysis

AWS Rekognition

Age 51-59
Gender Female, 75.5%
Sad 58.9%
Surprised 21.4%
Happy 10.8%
Angry 3.5%
Disgusted 1.5%
Confused 1.5%
Calm 1.2%
Fear 1.1%

AWS Rekognition

Age 40-48
Gender Male, 92.8%
Calm 75.2%
Happy 12.6%
Sad 7.7%
Surprised 2.9%
Disgusted 0.5%
Angry 0.4%
Fear 0.4%
Confused 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
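
The two AWS Rekognition blocks above report per-face estimates (age range, gender, emotion scores), while Google Vision reports coarse likelihood buckets (Very unlikely through Very likely) for attributes such as joy, sorrow, headwear, and blur. A minimal sketch of the Rekognition face call, again assuming a hypothetical local image file:

    import boto3

    # Assumes AWS credentials are already configured in the environment.
    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotion estimates.
    faces = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in faces["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
        print(f"Age {age['Low']}-{age['High']}, Gender {gender['Value']} {gender['Confidence']:.1f}%")
        for emotion in emotions:
            print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")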

Feature analysis

Amazon

Person 97.4%

Captions

Microsoft
created on 2022-01-23

an old photo of a person 64.4%
an old photo of a person 60.9%
old photo of a person 58.8%