Human Generated Data

Title

Three Generations

Date

20th century

People

Artist: José Clemente Orozco, Mexican, 1883–1949

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Louise E. Bettens Fund, M11009

Copyright

© José Clemente Orozco / Artists Rights Society (ARS), New York / SOMAAP, Mexico

Machine Generated Data

Tags

Amazon
created on 2023-08-30

Art 100
Drawing 99.9
Adult 98.5
Female 98.5
Person 98.5
Woman 98.5
Face 97.9
Head 97.9
Adult 97.2
Female 97.2
Person 97.2
Woman 97.2
Photography 58
Portrait 58
Painting 57.4
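
The Amazon labels above are object and scene detections scored as percent confidences. A minimal sketch of how comparable labels could be requested from Amazon Rekognition with boto3 follows; the image file name, label cap, and confidence threshold are illustrative assumptions, not values taken from this record, and this is not necessarily how the data above was produced.

    # Sketch: request image labels from Amazon Rekognition via boto3.
    # Assumes AWS credentials are configured and a local copy of the image exists.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("three_generations.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,          # cap the number of labels returned
        MinConfidence=50.0,    # drop labels below 50% confidence
    )

    # Print each label with its confidence, mirroring the "Art 100", "Drawing 99.9" style above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")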

Clarifai
created on 2023-11-01

people 99.6
portrait 99.1
art 97.4
adult 95.7
print 95.3
illustration 95.2
two 95.2
engraving 92.6
one 92.5
old 91.9
affection 91.8
woman 91.4
vintage 91
painting 90.5
antique 89.6
book bindings 88
man 82.3
book 78.6
paper 76.8
monochrome 76.4

Imagga
created on 2018-12-20

sketch 100
drawing 79.7
representation 70.5
money 31.5
cash 26.5
dollar 25.1
currency 24.2
wealth 21.5
portrait 20.7
paper 20.4
banking 20.2
bank 19.8
hundred 18.4
dollars 18.3
franklin 17.7
business 17.6
bill 17.1
face 17
finance 16.9
savings 16.8
close 16.6
one 16.4
finances 13.5
pay 13.4
financial 13.4
attractive 13.3
hair 12.7
us 12.5
loan 12.5
adult 12.3
people 12.3
closeup 12.1
rich 12.1
fashion 12.1
sexy 12
culture 12
person 11.9
bills 11.7
black 11.5
pretty 11.2
model 10.9
banknotes 10.8
man 10.8
sculpture 10.1
sensual 10
art 9.6
brunette 9.6
exchange 9.5
ancient 9.5
statue 9.5
investment 9.2
sign 9
architecture 8.8
states 8.7
debt 8.7
payment 8.7
human 8.2
sensuality 8.2
religion 8.1
market 8
body 8
wages 7.8
banknote 7.8
holy 7.7
old 7.7
head 7.6
elegance 7.6
number 7.5
economy 7.4
makeup 7.3
success 7.2
cute 7.2
history 7.2
women 7.1
male 7.1

Google
created on 2018-12-20

Microsoft
created on 2018-12-20

text 99.9
book 98.1
black and white 98.1
person 94.4
girl 76.2
portrait 43.3
art 29.3
drawing 28.2

Color Analysis

Face analysis

AWS Rekognition

Age 18-24
Gender Female, 96.5%
Calm 23.9%
Sad 20.1%
Angry 18.2%
Surprised 17.9%
Disgusted 14.7%
Fear 9.7%
Confused 3.2%
Happy 1.1%

AWS Rekognition

Age 31-41
Gender Female, 99.8%
Calm 61%
Angry 25.4%
Fear 10.1%
Surprised 6.5%
Sad 3.1%
Disgusted 1.1%
Confused 0.3%
Happy 0.3%
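
The two AWS Rekognition entries above correspond to two detected faces; each reports an estimated age range, a gender guess with confidence, and per-emotion confidence scores. A minimal sketch of how such face details could be retrieved with boto3 is below; the file name is an assumption, and the sketch is illustrative rather than a record of the actual pipeline.

    # Sketch: face analysis with Amazon Rekognition detect_faces.
    # Attributes=["ALL"] returns age range, gender, and emotion confidences per detected face.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("three_generations.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions come back unsorted; sort to match the highest-first listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")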

Microsoft Cognitive Services

Age 39
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 98.5%
Female 98.5%
Person 98.5%
Woman 98.5%

Categories

Imagga

paintings art 99.7%

Captions

Microsoft
created on 2018-12-20

a person and a book 49.4%
a person looking at a book 49.3%
a person holding a book 49.2%
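
The Microsoft captions above are candidate image descriptions with confidence scores. A minimal sketch of how captions like these could be requested from the Azure Computer Vision "Describe Image" REST operation follows; the endpoint, key, file name, and API version (v3.2, newer than the 2018 run recorded above) are assumptions for illustration.

    # Sketch: generate candidate captions with Azure Computer Vision Describe Image.
    import requests

    endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
    key = "<subscription-key>"                                        # placeholder

    with open("three_generations.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = requests.post(
        f"{endpoint}/vision/v3.2/describe",
        headers={
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/octet-stream",
        },
        params={"maxCandidates": 3},  # ask for several candidate captions, as listed above
        data=image_bytes,
    )
    response.raise_for_status()

    # Confidence is returned as 0-1; scale to percent to match the listing above.
    for caption in response.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")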

Text analysis

Amazon

Ouxceo
the Comente Ouxceo
the
Comente
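
The strings above are Amazon Rekognition's raw text detections from the print, most likely a partial misreading of the inscribed signature. A minimal sketch of how such detections could be produced with boto3 detect_text is below; the file name is an assumption.

    # Sketch: read inscribed or printed text from the image with Amazon Rekognition detect_text.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("three_generations.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Each detection is either a full LINE or one of its component WORDs.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"], f"{detection['Confidence']:.1f}%")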