Human Generated Data

Title

Untitled (negative image of man seated in elaborate chair, holding book)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6056

Machine Generated Data

Tags

Amazon
created on 2019-05-30

Advertisement 99.5
Collage 99.5
Poster 99.5
Human 95.2
Person 95.2
Mammal 88.8
Pet 88.8
Animal 88.8
Cat 88.8
Apparel 84
Clothing 84
Face 76.5
Female 72.6
Cat 62.1
Overcoat 58.7
Coat 58.7
Suit 58.7
Person 51.6

Clarifai
created on 2019-05-30

people 99.4
portrait 98.2
adult 96.1
one 95.7
man 95.4
art 90.3
wear 88.5
Halloween 88.4
illustration 85
woman 84
horror 82.4
group 82.2
child 80.7
fear 80.6
two 80.3
lid 80.3
veil 80.2
facial expression 78.3
music 78
print 75.8

Imagga
created on 2019-05-30

black 20
book jacket 15
portrait 14.9
money 14.5
currency 14.4
plastic bag 14
covering 13.8
man 13.4
bag 12.9
male 12.8
person 12.8
hair 12.7
art 12.5
negative 12.4
people 12.3
film 12.3
container 12.2
cash 11.9
jacket 11.7
screen 11.6
one 11.2
financial 10.7
pretty 10.5
business 10.3
dollar 10.2
face 9.9
bank 9.9
human 9.7
adult 9.7
sexy 9.6
finance 9.3
banking 9.2
attractive 9.1
wrapping 8.9
expression 8.5
head 8.4
savings 8.4
economy 8.3
photographic paper 8
window 8
bill 7.6
sketch 7.6
paper 7.6
vintage 7.4
style 7.4
glass 7.4
light 7.4
make 7.3
body 7.2
smile 7.1
market 7.1

Microsoft
created on 2019-05-30

drawing 99.5
sketch 99.2
text 98.5
painting 97.6
book 96
cat 92.4
black and white 83.3
human face 79.8
cartoon 77.3
clothing 68.5
person 64.7

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Male, 99.4%
Happy 12.2%
Disgusted 11.3%
Surprised 7.3%
Confused 14.8%
Sad 8%
Angry 23%
Calm 23.4%

AWS Rekognition

Age 4-9
Gender Female, 71.8%
Angry 3.8%
Calm 89.3%
Confused 2.6%
Disgusted 0.9%
Happy 0.3%
Sad 1.5%
Surprised 1.6%

AWS Rekognition

Age 38-59
Gender Male, 95.2%
Angry 4.2%
Calm 31.6%
Sad 49.1%
Disgusted 1.8%
Surprised 2.6%
Happy 4.7%
Confused 5.9%

Feature analysis

Amazon

Person 95.2%
Cat 88.8%

Captions

Microsoft

a cat looking at a book 49.7%
a black and white photo of a cat 49.6%
a cat that is looking at a book 43.6%