Human Generated Data

Title

Untitled (two photographs: photograph of a framed studio portrait of a bowling team; photograph of a studio portrait of a nun holding a Bible)

Date

c. 1905-1915, printed c. 1970

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5984

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.6
Human 99.6
Person 98.6
Person 97.9
Person 89.1
Person 83.8
Person 81.5
Art 75.2
Shop 73.4
Clothing 70
Apparel 70
Text 59.2
People 58.8
Priest 56.1
Window Display 55.1
Person 51.1
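
Labels like those above, a name paired with a 0-100 confidence score, are the output of Amazon Rekognition's DetectLabels API. The following is a minimal sketch of how such tags could be reproduced with boto3; the file path is a placeholder and AWS credentials are assumed to be configured already.

    import boto3

    client = boto3.client("rekognition")

    # Placeholder path; Rekognition also accepts images stored in S3.
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,  # the list above bottoms out near 51%
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")

Labels such as Person also carry bounding-box Instances in the response, which is why Person appears several times above with different confidences.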

Clarifai
created on 2019-11-16

people 99.9
adult 97.7
wear 96.1
woman 96.1
man 95.9
leader 95.7
outfit 94.1
group 94.1
administration 92.8
portrait 91.5
one 91.2
veil 86.7
two 86.7
child 86.1
furniture 84.4
gown (clothing) 80.2
art 80
music 78.8
group together 78.7
indoors 77.6
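
Concept lists in this form come from Clarifai's prediction API. Below is a hedged sketch against the v2 REST endpoint; the API key, model ID, and image URL are placeholders, and the payload shape should be verified against current Clarifai documentation.

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
    MODEL_ID = "general-image-recognition"  # assumed public general model ID

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )
    resp.raise_for_status()

    # Concept values are 0-1; scale to match the percentages listed above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")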

Imagga
created on 2019-11-16

window 36.7
blackboard 21.8
case 21.4
framework 20.1
old 18.8
black 18.1
vintage 16.5
wall 16.4
art 15.8
supporting structure 13.7
culture 12.8
religion 12.5
shop 12.2
antique 12.1
architecture 11.9
frame 11.8
house 11.7
history 11.6
symbol 11.4
people 11.1
screen 11
male 10.6
office 10.6
texture 10.4
ancient 10.4
barbershop 10.3
room 10.2
grunge 10.2
film 10.2
protective covering 9.7
interior 9.7
one 9.7
building 9.6
door 9
home 8.8
man 8.7
light 8.7
monument 8.4
tourism 8.2
covering 8.2
paintings 7.8
glass 7.8
travel 7.7
painted 7.6
post 7.6
fireplace 7.6
decoration 7.5
sign 7.5
structure 7.5
mercantile establishment 7.4
memorial 7.4
letter 7.3
historic 7.3
design 7.3
business 7.3
paint 7.2
detail 7.2
sculpture 7.1
wooden 7
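
Imagga exposes tags like these through its /v2/tags REST endpoint, authenticated with an API key and secret. A minimal sketch, with placeholder credentials and image URL:

    import requests
    from requests.auth import HTTPBasicAuth

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},  # placeholder
        auth=HTTPBasicAuth("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),
    )
    resp.raise_for_status()

    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")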

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

clothing 95.3
person 94.9
black 92.1
window 88.6
man 88.4
text 87.3
black and white 87
white 69.4
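
Tag-and-confidence pairs like these are returned by the Analyze Image operation of Azure's Computer Vision service. A minimal sketch, assuming an Azure resource endpoint and subscription key (both placeholders):

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.com/photo.jpg"},
    )
    resp.raise_for_status()

    # Confidences are 0-1; scale to match the percentages above.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")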

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 53.6%
Happy 45%
Disgusted 45%
Confused 45.1%
Calm 54.2%
Fear 45%
Sad 45.6%
Angry 45.1%
Surprised 45%

AWS Rekognition

Age 12-22
Gender Male, 52.6%
Happy 45%
Confused 45%
Calm 54.9%
Angry 45%
Disgusted 45%
Surprised 45%
Fear 45%
Sad 45.1%

AWS Rekognition

Age 20-32
Gender Male, 54.5%
Angry 45%
Fear 45%
Calm 54.8%
Sad 45.1%
Disgusted 45%
Happy 45%
Confused 45%
Surprised 45%

AWS Rekognition

Age 22-34
Gender Male, 51.3%
Calm 53%
Angry 45.2%
Disgusted 45%
Happy 45%
Sad 46.5%
Confused 45.2%
Fear 45%
Surprised 45%

AWS Rekognition

Age 20-32
Gender Male, 53.7%
Fear 45%
Angry 45%
Calm 55%
Surprised 45%
Happy 45%
Confused 45%
Sad 45%
Disgusted 45%

AWS Rekognition

Age 28-44
Gender Male, 51.3%
Surprised 45%
Fear 45%
Happy 45%
Sad 45.9%
Calm 54%
Disgusted 45%
Angry 45%
Confused 45%
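
Each AWS Rekognition block above corresponds to one entry in the FaceDetails list returned by the DetectFaces API when all facial attributes are requested. A minimal sketch (placeholder path, credentials assumed configured):

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # placeholder path
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")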

Microsoft Cognitive Services

Age 25
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
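
The likelihood ratings above map to the face_detection method of the Cloud Vision client library, whose face annotations expose each attribute as a Likelihood enum. A minimal sketch, assuming the google-cloud-vision package and application credentials are set up (the path is a placeholder):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # placeholder path
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each field is a Likelihood enum such as VERY_UNLIKELY, as listed above.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)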

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

y
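
A stray character like the "y" above is typical output from Amazon Rekognition's DetectText API when run over a photograph of framed prints. A minimal sketch (placeholder path, credentials assumed configured):

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # placeholder path
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        # Detections come back as LINE and WORD entries with confidences.
        print(detection["DetectedText"], detection["Type"],
              f"{detection['Confidence']:.1f}")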