Human Generated Data

Title

Untitled (portrait of five men holding string of fish)

Date

c. 1930

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4263

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.5
Person 99.5
Person 99
Person 98.7
Person 98.3
Person 97.3
Face 76.3
People 71.5
Art 70.2
Clothing 63.2
Apparel 63.2
Portrait 60.8
Photo 60.8
Photography 60.8
Drawing 60.7
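
The Amazon tags above are the kind of label/confidence pairs returned by AWS Rekognition's label-detection operation. A minimal sketch of how such tags could be retrieved with boto3 follows; the S3 bucket and object key are placeholders, not the actual image location.

    # Minimal sketch: fetching image labels with AWS Rekognition via boto3.
    # The S3 bucket and key below are placeholders, not the real image location.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "durette-studio-4263.jpg"}},
        MaxLabels=20,
        MinConfidence=60,
    )

    for label in response["Labels"]:
        # Prints pairs like "Person 99.5", matching the tag list above.
        print(f"{label['Name']} {label['Confidence']:.1f}")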

Clarifai
created on 2019-06-01

people 99.9
group 99.3
man 98.4
adult 96.7
woman 95.7
wear 93.5
many 90.8
group together 84.8
several 84.5
veil 83.9
medical practitioner 81.4
outfit 78.4
child 77.5
uniform 71.5
education 71.1
leader 69.3
administration 64.4
desktop 64.4
illustration 59.9
music 58.6
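
The Clarifai tags correspond to concept predictions from a general-recognition model. A hedged sketch against the Clarifai v2 REST predict endpoint is below; the API key, model id, and image URL are placeholder assumptions, not values taken from this record.

    # Rough sketch of a Clarifai v2 predict call with a general concept model.
    # API key, model id, and image URL are placeholders (assumptions), not real credentials.
    import requests

    CLARIFAI_API_KEY = "YOUR_API_KEY"       # placeholder
    MODEL_ID = "general-image-recognition"  # assumed general model id

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
    )
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Prints pairs like "people 99.9", matching the list above.
        print(f"{concept['name']} {concept['value'] * 100:.1f}")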

Imagga
created on 2019-06-01

sketch 100
drawing 95.2
representation 68.7
negative 26.9
grunge 24.7
art 24.4
film 20.7
vintage 19.8
old 19.5
design 16.3
silhouette 15.7
photographic paper 15.3
retro 14.7
graphic 14.6
antique 13
black 12.6
decorative 12.5
texture 12.5
decoration 12
style 11.9
paint 11.8
aged 11.8
pattern 10.9
architecture 10.9
photographic equipment 10.2
people 10
artistic 9.6
man 9.4
frame 9.2
urban 8.7
ancient 8.6
landscape 8.2
backgrounds 8.1
symbol 8.1
ink 7.7
culture 7.7
grungy 7.6
border 7.2
religion 7.2
history 7.2
mountain 7.1
cool 7.1
creative 7.1
paper 7.1
textured 7
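
The Imagga tags resemble output from Imagga's v2 tagging endpoint. A sketch using basic-auth REST access is below; the key/secret pair and image URL are placeholders.

    # Sketch of requesting tags from Imagga's v2 tagging endpoint.
    # The key/secret pair and image URL are placeholders, not working credentials.
    import requests

    IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
    IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    resp.raise_for_status()

    for tag in resp.json()["result"]["tags"]:
        # Prints pairs like "sketch 100", matching the list above.
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")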

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

window 96.2
posing 95.5
old 90.7
person 67.5
clothing 63
vintage 31.4
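
The Microsoft tags are the sort of result returned by Azure Computer Vision's Analyze Image operation with the Tags feature. A sketch over the REST interface is below; the endpoint, subscription key, API version, and image URL are assumptions.

    # Sketch of tagging an image with Azure Computer Vision's Analyze Image operation.
    # Endpoint, subscription key, API version, and image URL are placeholder assumptions.
    import requests

    ENDPOINT = "https://example-region.api.cognitive.microsoft.com"  # placeholder
    SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"                       # placeholder

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
        json={"url": "https://example.org/photo.jpg"},
    )
    resp.raise_for_status()

    for tag in resp.json()["tags"]:
        # Prints pairs like "window 96.2", matching the list above.
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")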

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 54.6%
Happy 45.9%
Confused 45.4%
Disgusted 45.6%
Sad 45.7%
Calm 51.4%
Angry 45.3%
Surprised 45.6%

AWS Rekognition

Age 20-38
Gender Male, 54.8%
Angry 45.7%
Sad 45.8%
Surprised 45.9%
Happy 46.7%
Calm 49.4%
Disgusted 45.4%
Confused 46.1%

AWS Rekognition

Age 26-43
Gender Male, 54.9%
Angry 46.7%
Sad 45.7%
Happy 45.5%
Calm 50.7%
Confused 45.4%
Disgusted 45.3%
Surprised 45.6%

AWS Rekognition

Age 26-43
Gender Male, 55%
Calm 53.6%
Surprised 45.1%
Sad 45.5%
Confused 45.1%
Disgusted 45.1%
Happy 45.2%
Angry 45.4%

AWS Rekognition

Age 20-38
Gender Male, 54.8%
Happy 46.5%
Angry 45.4%
Surprised 46%
Disgusted 45.3%
Sad 46.7%
Calm 49.3%
Confused 45.8%
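
The five face-analysis blocks above carry the per-face attributes that AWS Rekognition's face-detection operation reports: an age range, a gender estimate with confidence, and a score for each emotion. A minimal boto3 sketch, again with a placeholder S3 location, is:

    # Minimal sketch: per-face attributes from AWS Rekognition, as listed above.
    # The S3 bucket and key are placeholders, not the real image location.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "durette-studio-4263.jpg"}},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            # Emotion entries look like {"Type": "CALM", "Confidence": 51.4}.
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")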

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

paintings art 97.4%
text visuals 2.5%

Text analysis

Amazon

mrm
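
The single detected string ("mrm") is the kind of result AWS Rekognition's text-detection operation returns for faint or handwritten marks on a print. A hedged boto3 sketch, with a placeholder S3 location:

    # Sketch of reading detected text (e.g. the "mrm" result above) via AWS Rekognition.
    # The S3 bucket and key are placeholders, not the real image location.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "durette-studio-4263.jpg"}}
    )

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(f"{detection['DetectedText']} ({detection['Confidence']:.1f}%)")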