Human Generated Data

Title

Untitled (photograph of a photograph mounted on a wall, portrait of four women)

Date

c. 1950

People

Artist: Jack Rodden Studio, American, 1914 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13639

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Apparel 99.9
Clothing 99.9
Person 99.2
Human 99.2
Person 98.9
Person 98.7
Person 97.7
Fashion 94.6
Gown 94.6
Robe 94.1
Dress 94
Female 92.9
Wedding 88.2
Evening Dress 83
Woman 81.8
Wedding Gown 78.2
Bride 69.5
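
The label/score pairs above match the shape of Amazon Rekognition's `detect_labels` response. A minimal sketch of how such a listing could be produced, using an illustrative sample dict in place of a live API call (a real call would go through `boto3.client("rekognition").detect_labels(...)`):

```python
# Sketch: format labels from a Rekognition-style detect_labels response.
# sample_response is illustrative data copied from the listing above,
# not the output of an actual API call.
sample_response = {
    "Labels": [
        {"Name": "Apparel", "Confidence": 99.9},
        {"Name": "Person", "Confidence": 99.2},
        {"Name": "Bride", "Confidence": 69.5},
    ]
}

def format_labels(response, min_confidence=50.0):
    """Return 'Name Confidence' lines, highest confidence first."""
    labels = [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    return [f"{name} {round(conf, 1)}" for name, conf in labels]

print(format_labels(sample_response))
```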

Imagga
created on 2022-02-04

television 100
telecommunication system 73.6
broadcasting 36.1
telecommunication 26.2
man 19.5
black 18
medium 17.4
people 16.7
windowsill 16
business 15.2
businessman 15
sill 13.8
monitor 13.4
silhouette 13.2
office 12.8
male 12.8
person 12.7
computer 12.2
screen 10.7
working 10.6
art 10.4
adult 10.3
support 10.1
old 9.7
structural member 9.6
couple 9.6
portrait 9
one 8.9
building 8.7
love 8.7
bride 8.6
newspaper 8.5
back 8.3
happy 8.1
window 8
dark 7.5
alone 7.3
dirty 7.2
face 7.1

Google
created on 2022-02-04

Picture frame 95.2
Rectangle 88
Sleeve 87.1
Dress 86.8
Gesture 85.3
Art 80.3
Tints and shades 77.4
Monochrome photography 72.4
Font 70.1
Visual arts 70.1
Wood 69.1
Event 67.5
Room 67.2
Monochrome 66.5
Vintage clothing 65.9
Plant 62.7
Tree 62.4
History 62.2
Stock photography 61.8
Pattern 61.6

Microsoft
created on 2022-02-04

dress 97.7
text 93.2
wedding dress 91.2
clothing 90.5
gallery 90
room 87.4
woman 85.8
person 82.5
white 78.1
bride 71.3
scene 70.9
wedding 62.5
painting 61.8
picture frame 37.5

Face analysis

AWS Rekognition

Age 54-62
Gender Male, 69.5%
Calm 90.5%
Happy 3.3%
Confused 1.8%
Sad 1.4%
Surprised 0.9%
Angry 0.8%
Disgusted 0.8%
Fear 0.4%

AWS Rekognition

Age 29-39
Gender Female, 99.8%
Calm 99.9%
Surprised 0%
Happy 0%
Confused 0%
Disgusted 0%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 43-51
Gender Female, 57.8%
Calm 47.6%
Confused 13.6%
Surprised 11.9%
Sad 8%
Fear 6.7%
Disgusted 5.3%
Angry 5.2%
Happy 1.7%

AWS Rekognition

Age 38-46
Gender Male, 92.3%
Calm 54.5%
Sad 25.2%
Happy 5.8%
Disgusted 5.2%
Confused 4%
Fear 2.3%
Surprised 1.6%
Angry 1.3%
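
Each block above follows the shape of a Rekognition `detect_faces` face record (`AgeRange`, `Gender`, `Emotions`). A hedged sketch that summarizes one such record by picking its dominant emotion; the sample dict mirrors the first face above rather than a live call:

```python
# Sketch: summarize one face from a Rekognition-style detect_faces
# response. The dict is illustrative sample data, not an API result.
face = {
    "AgeRange": {"Low": 54, "High": 62},
    "Gender": {"Value": "Male", "Confidence": 69.5},
    "Emotions": [
        {"Type": "CALM", "Confidence": 90.5},
        {"Type": "HAPPY", "Confidence": 3.3},
        {"Type": "CONFUSED", "Confidence": 1.8},
    ],
}

def summarize_face(face):
    """One-line summary: age range, gender, dominant emotion."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return (
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}, "
        f"{face['Gender']['Value']} ({face['Gender']['Confidence']}%), "
        f"dominant emotion {top['Type']} ({top['Confidence']}%)"
    )

print(summarize_face(face))
```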

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
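
The "Very unlikely" strings above correspond to Google Cloud Vision's `Likelihood` enum, which face annotations report for attributes such as joy and headwear. A small sketch of that mapping, assuming the standard enum ordering (`UNKNOWN = 0` through `VERY_LIKELY = 5`); no live API call is made:

```python
# Google Cloud Vision Likelihood enum display names, in ascending
# enum order (UNKNOWN = 0 ... VERY_LIKELY = 5).
LIKELIHOOD_NAMES = [
    "Unknown", "Very unlikely", "Unlikely",
    "Possible", "Likely", "Very likely",
]

def likelihood_label(value):
    """Map a Likelihood enum int (0-5) to its display string."""
    return LIKELIHOOD_NAMES[value]

# A face annotation whose joy likelihood is 1 reads "Very unlikely".
print(likelihood_label(1))
```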

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

an old photo of a person in a room 77.3%
a person standing in front of a window 50.4%
an old photo of a person 50.3%

Text analysis

Amazon

YT37A2-XAOOX

Google

YT3RA2-XA
YT3RA2-XA