Human Generated Data

Title

The Groom

Date

c. 1950

People

Artist: Federico Castellón, American, 1914 - 1971

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Donna McLaughlin-Wyant and Jeffrey Wyant, M25124

Machine Generated Data

Tags

Amazon
created on 2019-10-30

Human 99.1
Person 99.1
Person 96.8
Art 95.8
Collage 82.3
Advertisement 82.3
Poster 82.3
Person 79.4
Painting 75.4
Text 72.1
Drawing 63.4
Face 55.9
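
The label and confidence pairs above are the output of an image label-detection service. As a minimal sketch (not the museum's actual pipeline), tags of this kind are typically retrieved from AWS Rekognition with boto3; the image file name and thresholds below are assumptions for illustration.

    import boto3

    # Sketch: request image labels similar to the Amazon tags listed above.
    client = boto3.client("rekognition")

    with open("the-groom.jpg", "rb") as f:  # hypothetical local image file
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")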

Clarifai
created on 2019-10-30

people 100
adult 99.4
two 99.2
group 99.2
three 98
veil 97.8
man 96.7
woman 96.7
art 95.9
print 95.8
one 95.8
portrait 95.7
wear 95.5
painting 94.8
illustration 94.4
child 94.3
lid 94.2
four 93.5
furniture 93
several 86.8

Imagga
created on 2019-10-30

sculpture 44.9
carving 36
column 31.8
art 31.4
religion 25.1
ancient 25.1
architecture 24.3
stone 23.5
statue 23.5
culture 23.1
currency 20.7
history 20.6
money 20.4
cash 20.2
monument 19.6
old 19.5
structure 18.9
temple 18.9
dollar 16.7
god 15.3
bank 15.2
face 14.9
travel 14.8
finance 14.4
famous 14
altar 13.7
one 13.5
spirituality 13.5
bill 13.3
figure 13.1
banking 12.9
historic 12.8
cemetery 12.8
carved 12.7
close 12.6
paper 12.6
pay 12.5
ruler 12.3
historical 12.2
religious 12.2
business 12.2
church 12
financial 11.6
detail 11.3
building 11.2
savings 11.2
museum 11.2
rich 11.2
landmark 10.8
wealth 10.8
marble 10.8
dollars 10.6
loan 10.6
plastic art 10.4
symbol 10.1
antique 10.1
city 10
vintage 9.9
tourism 9.9
decoration 9.7
banknote 9.7
heritage 9.7
holy 9.6
exchange 9.6
traditional 9.2
portrait 9.1
banknotes 8.8
hundred 8.7
architectural 8.7
spiritual 8.6
golden 8.6
capital 8.5
head 8.4
economy 8.4
chest 8.3
facade 8.2
support 8.2
closeup 8.1
carvings 7.9
statues 7.9
cathedral 7.8
memorial 7.7
us 7.7
profit 7.7
details 7.6
sketch 7.5
investment 7.3
painter 7.2
relief 7

Google
created on 2019-10-30

Microsoft
created on 2019-10-30

text 97.7
human face 97.7
person 95.5
clothing 93.1
drawing 73.6
painting 65.6
picture frame 63.3
old 56.4
posing 48.4

Color Analysis

Face analysis

AWS Rekognition

Age 24-38
Gender Male, 94.2%
Surprised 0.1%
Disgusted 0.3%
Angry 0.6%
Fear 0.1%
Calm 91.8%
Happy 0.1%
Sad 6.9%
Confused 0.1%

AWS Rekognition

Age 32-48
Gender Female, 50.2%
Disgusted 0%
Fear 0.1%
Sad 99.3%
Happy 0%
Angry 0.1%
Calm 0.4%
Confused 0%
Surprised 0%
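
The age range, gender, and emotion scores above correspond to AWS Rekognition face detection output. A minimal sketch of such a request with boto3 follows; the file name is an assumption for illustration.

    import boto3

    # Sketch: face attributes (age range, gender, emotions) like those reported above.
    client = boto3.client("rekognition")

    with open("the-groom.jpg", "rb") as f:  # hypothetical local image file
        image_bytes = f.read()

    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")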

Microsoft Cognitive Services

Age 23
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
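
The likelihood ratings above match the face annotation fields returned by Google Cloud Vision. A minimal sketch with the google-cloud-vision Python client follows; the image path is an assumption for illustration.

    from google.cloud import vision

    # Sketch: likelihood ratings (surprise, anger, sorrow, joy, headwear, blur)
    # as reported above, via Google Cloud Vision face detection.
    client = vision.ImageAnnotatorClient()

    with open("the-groom.jpg", "rb") as f:  # hypothetical local image file
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)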

Feature analysis

Amazon

Person 99.1%
Painting 75.4%

Categories

Imagga

paintings art 99.9%

Captions

Microsoft
created on 2019-10-30

an old photo of a person 80.6%
old photo of a person 78.7%
an old photo of a person 76.2%

Text analysis

Google

Cutrbe
Cutrbe