Human Generated Data

Title

The Dubourg Family

Date

c. 1878

People

Artist: Ignace-Henri-Jean-Théodore Fantin-Latour, French, 1836–1904

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Grenville L. Winthrop, 1943.820.A

Machine Generated Data

Tags

Amazon
created on 2020-04-25

Human 99.1
Person 99.1
Person 98.5
Painting 97.7
Art 97.7
Person 96.9
Person 92
People 82.8
Drawing 81.2
Portrait 63.5
Photography 63.5
Photo 63.5
Face 63.5
Sketch 57.9

Clarifai
created on 2020-04-25

people 100
art 99.4
portrait 99.4
group 99.1
wear 98.8
adult 98.6
two 98.1
family 98.1
man 98.1
print 97.5
three 97.4
woman 97.1
sepia pigment 97
offspring 96.5
furniture 96
vintage 95.8
sepia 94.8
painting 94.8
son 94.1
engraving 93.8

Imagga
created on 2020-04-25

kin 44.3
statue 38.6
sculpture 34.1
art 25.8
ancient 25.1
history 24.2
sketch 21.6
religion 20.6
architecture 20.5
drawing 17.5
old 17.4
stone 17.3
culture 17.1
religious 16.9
monument 16.8
ruler 16.7
antique 16.2
historical 16
historic 15.6
money 15.3
marble 14.8
representation 14.8
catholic 14.6
column 14.3
travel 14.1
dollar 13.9
detail 13.7
face 13.5
god 13.4
vintage 13.2
figure 13
church 13
cash 12.8
currency 12.6
paper 12.5
carving 12.3
building 11.9
city 11.6
tourism 11.6
symbol 11.5
portrait 11
closeup 10.8
roman 10.7
dollars 10.6
holy 10.6
one 10.5
decoration 10.4
famous 10.2
museum 10.2
finance 10.1
cemetery 10
bank 9.9
temple 9.8
carved 9.8
bill 9.5
savings 9.3
banking 9.2
close 9.1
financial 8.9
saint 8.7
memorial 8.7
spirituality 8.6
business 8.5
people 8.4
tourist 8.2
wealth 8.1
man 8.1
pray 7.8
cathedral 7.7
loan 7.7
outdoor 7.6
daily 7.6
exchange 7.6
head 7.6
design 7.5
rich 7.4
world 7.3
mother 7.3
landmark 7.2

Google
created on 2020-04-25

Microsoft
created on 2020-04-25

text 100
book 99.9
clothing 96.5
sketch 96.2
drawing 96
person 94.7
human face 89.2
woman 86.7
old 86.2
painting 77.3
black 66.5
posing 61.1
vintage 31.3

Face analysis

AWS Rekognition

Age 36-52
Gender Female, 54.3%
Angry 1.1%
Disgusted 0%
Calm 12.1%
Surprised 0%
Sad 84.8%
Fear 1.6%
Happy 0%
Confused 0.3%

AWS Rekognition

Age 24-38
Gender Female, 80.3%
Angry 0.7%
Disgusted 0.1%
Calm 2.6%
Surprised 0.1%
Sad 95.7%
Fear 0.5%
Happy 0.1%
Confused 0.2%

AWS Rekognition

Age 22-34
Gender Female, 77.4%
Angry 3.7%
Disgusted 0.2%
Calm 7.8%
Surprised 0.2%
Sad 84.7%
Fear 2.7%
Happy 0%
Confused 0.7%

AWS Rekognition

Age 36-54
Gender Male, 98.9%
Angry 0.3%
Disgusted 0%
Calm 94.5%
Surprised 0%
Sad 5.1%
Fear 0%
Happy 0%
Confused 0%

Microsoft Cognitive Services

Age 29
Gender Female

Microsoft Cognitive Services

Age 36
Gender Female

Microsoft Cognitive Services

Age 42
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Painting 97.7%

Captions

Microsoft

a vintage photo of a person holding a book 77.8%
a vintage photo of a person 77.7%
a vintage photo of a person 77.6%

Text analysis

Amazon

AAon
2.
thouiind AAon 2. Tesla
thouiind
Tesla

Google

Fintiry tablian reeine
Fintiry
tablian
reeine