Human Generated Data

Title

Untitled (negative of 25 multiple images of a man's portrait)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3702

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 88.8
Person 88.8
Person 85.5
Person 85.3
Person 85
Face 83.7
Person 83.6
Person 83.4
Person 81.6
Person 81.2
Person 80.6
Person 79.8
Person 79.8
Person 78.3
Person 76.5
Text 76.5
Collage 76.2
Poster 76.2
Advertisement 76.2
Person 75.1
Person 73.8
Person 72.7
Person 72.4
Person 71
Person 70.9
Person 68.3
Person 68
Person 67.4
Person 66.9
Person 64.4
Photography 63.4
Portrait 63.4
Photo 63.4
Person 61.3
Drawing 59.3
Art 59.3
Female 55.1

Clarifai
created on 2019-06-01

set 93.6
image 92.5
symbol 92.3
collection 91.7
dollar 91.5
food 89.9
desktop 86.6
design 84.1
disjunct 81.7
health 80.7
isolated 79.8
pattern 79.6
vector 78.5
no person 78
refreshment 77.6
medicine 77
drink 75.1
decoration 74.7
clear 73.4
lock 73.3

Imagga
created on 2019-06-01

pattern 38.9
wallpaper 35.2
texture 33.3
design 32.1
backdrop 28
graphic 25.5
seamless 24.7
art 24.3
shape 22.3
decoration 15.8
retro 15.5
modern 15.4
clean 15
color 15
surface 15
digital 14.6
light 13.3
decorative 13.3
cool 13.3
bubble 13.2
textured 13.1
ornament 12.9
reflection 12.8
black 12.6
paper 12.4
element 12.4
3d 12.4
liquid 12.3
glass 12.1
circle 12
sign 12
water 12
futuristic 11.7
transparent 11.6
tile 11.4
drops 11.3
rain 11.3
technology 11.1
shiny 11.1
colorful 10.7
wet 10.7
backgrounds 10.5
aqua 10.5
close 10.3
web 10.2
symbol 10.1
crystal 10.1
ornate 10
business 9.7
style 9.6
curve 9.6
clear 9.6
bubbles 9.5
floral 9.3
elegance 9.2
dark 9.2
container 9
creative 8.8
computer 8.8
droplet 8.7
circles 8.6
trendy 8.4
communication 8.4
fractal 8.3
effect 8.2
drop 8.1
science 8
decor 7.9
vintage 7.8
industry 7.7
old 7.7
set 7.6
tech 7.6
geometric 7.6
flowing 7.5
silhouette 7.4
protective covering 7.4
environment 7.4
flow 7.4
shine 7.3
smooth 7.3
square 7.2
star 7.2
shower curtain 7.2
fresh 7.2
bright 7.1
thimble 7.1
icon 7.1

Google
created on 2019-06-01

Font 68.6

Microsoft
created on 2019-06-01

cartoon 84.5
illustration 83.5
human face 82.9
art 73.6
black and white 68.4
several 12.6

Face analysis

Amazon

AWS Rekognition

Age 16-27
Gender Female, 51.4%
Angry 45.1%
Surprised 45.1%
Calm 54.2%
Sad 45.4%
Confused 45.3%
Disgusted 45%
Happy 45%

AWS Rekognition

Age 20-38
Gender Female, 53.8%
Sad 49%
Happy 45.1%
Surprised 46.1%
Calm 47.5%
Disgusted 45.5%
Confused 45.6%
Angry 46.3%

AWS Rekognition

Age 20-38
Gender Female, 54.5%
Happy 46.3%
Confused 45.6%
Disgusted 45.8%
Sad 46.2%
Calm 50.1%
Angry 45.4%
Surprised 45.6%

AWS Rekognition

Age 26-44
Gender Female, 54.2%
Disgusted 45.5%
Sad 49%
Surprised 45.3%
Happy 45.8%
Angry 45.6%
Calm 48.1%
Confused 45.6%

AWS Rekognition

Age 6-13
Gender Female, 52.8%
Sad 46.9%
Surprised 45.3%
Disgusted 45.5%
Angry 45.6%
Calm 50.1%
Happy 45.6%
Confused 45.9%

AWS Rekognition

Age 35-52
Gender Female, 54.1%
Disgusted 45.2%
Calm 49.6%
Confused 45.2%
Sad 47.8%
Surprised 45.3%
Angry 45.7%
Happy 46.1%

AWS Rekognition

Age 38-57
Gender Female, 51.6%
Happy 45.6%
Confused 45.5%
Disgusted 45.2%
Sad 48%
Calm 50.3%
Angry 45.3%
Surprised 45.2%

AWS Rekognition

Age 17-27
Gender Female, 54.9%
Angry 45.6%
Sad 45.8%
Happy 48.1%
Confused 45.4%
Surprised 45.5%
Calm 48.9%
Disgusted 45.7%

AWS Rekognition

Age 35-52
Gender Female, 53.7%
Happy 45.3%
Disgusted 45.4%
Angry 45.5%
Calm 51.5%
Surprised 45.6%
Confused 45.5%
Sad 46.2%

AWS Rekognition

Age 26-43
Gender Female, 54.6%
Happy 45.8%
Calm 49.8%
Angry 46.3%
Surprised 46%
Disgusted 45.4%
Confused 45.6%
Sad 46.2%
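Each AWS Rekognition face record above reports confidences for the same seven emotions; taking the arg-max gives a dominant emotion per detected face. A minimal sketch (assumed structure, not the museum's code), using the first face record as data:

```python
# First face record from the analysis above, re-entered as a dict.
# The seven emotion scores come straight from the listing.
face = {
    "Angry": 45.1, "Surprised": 45.1, "Calm": 54.2, "Sad": 45.4,
    "Confused": 45.3, "Disgusted": 45.0, "Happy": 45.0,
}

# The dominant emotion is simply the highest-confidence key.
dominant = max(face, key=face.get)
print(dominant)  # Calm
```

Applied across the ten records above, this would read "Calm" for most faces, with "Sad" narrowly leading in a couple — plausible for a repeated formal studio portrait.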

Feature analysis

Amazon

Person 88.8%

Captions

Microsoft

an old photo of a person 42%
old photo of a person 39.9%
a close up of a machine 34.9%

Text analysis

Amazon

BRRR
RRRRR
BR
RRR