Human Generated Data

Title

Untitled (Lady Filmer seated in center; to her right, Miss Murray; standing behind Lady Filmer, Lady Alfred Murray; seated to Lady Filmer's left, Lord Tyrone)

Date

1862-1888

People

Artist: Mary Georgiana Caroline Cecil Filmer, British, 1838-1903

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.359.44

Machine Generated Data

Tags

Amazon
created on 2019-04-05

Human 98.3
Person 98.3
Person 96.8
Painting 94.6
Art 94.6
Person 94.3
Person 90.9
Helmet 83.5
Apparel 83.5
Clothing 83.5
Outdoors 65.2
Drawing 63.4
People 61.2
Photography 57.5
Photo 57.5
Nature 56.7
Funeral 56.3
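
The label list above has the shape of output from Amazon Rekognition's label-detection API. A minimal sketch follows, assuming boto3 with configured AWS credentials; the file name is a hypothetical stand-in for the digitized album page.

```python
# Minimal sketch of fetching label/confidence pairs like the ones above with
# Amazon Rekognition. Assumes boto3 and configured AWS credentials; the file
# name "filmer_album_page.jpg" is a hypothetical placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("filmer_album_page.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,        # cap the number of tags returned
        MinConfidence=50.0,  # drop low-confidence guesses
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```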

Clarifai
created on 2018-04-19

people 99.9
adult 99.4
two 98.1
man 97.8
group 97.6
print 96
wear 94.9
woman 94.4
one 92.1
child 91.2
art 90.2
furniture 88.9
uniform 88.1
reclining 87.5
military 87
portrait 85.5
sit 85.1
veil 85.1
interaction 85
music 83.8
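
The concept list above matches the shape of a Clarifai general-model prediction. The sketch below uses Clarifai's public v2 REST endpoint as I understand it; the model id, API key, and image URL are placeholders, so treat the details as assumptions to verify against current documentation.

```python
# Hedged sketch of a Clarifai "general" model prediction over HTTP, the kind of
# call that yields concept/confidence pairs like the list above. Endpoint path,
# model id, and response shape follow Clarifai's public v2 REST API as I
# understand it; all credentials and URLs are placeholders.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"                         # hypothetical
MODEL_ID = "general-image-recognition"                    # assumed model id
IMAGE_URL = "https://example.org/filmer_album_page.jpg"   # hypothetical

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```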

Imagga
created on 2018-04-19

cathode-ray tube 40.6
tray 33.7
gas-discharge tube 32.4
container 30.2
receptacle 27.3
money 27.2
currency 26
vessel 25
tube 23.7
dollar 22.2
finance 21.1
cash 21
business 18.2
close 17.7
paper 17.3
electronic device 16.2
banking 15.6
hundred 15.5
bathtub 15.3
bank 15.2
us 14.4
bill 14.2
financial 14.2
one 13.4
dollars 12.5
wealth 11.7
device 11.5
exchange 11.4
savings 11.2
object 11
franklin 10.8
glass 10.5
tub 10.4
rich 10.2
decoration 10.2
closeup 10.1
vintage 9.9
old 9.7
art 9.2
gold 9
market 8.9
detail 8.8
symbol 8.7
bills 8.7
pay 8.6
note 8.3
religion 8.1
celebration 8
interior 7.9
funds 7.8
banknotes 7.8
golden 7.7
states 7.7
loan 7.7
notes 7.7
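
The tag list above has the shape of Imagga's tagging response. A hedged sketch follows, assuming Imagga's /v2/tags endpoint with basic authentication; the key, secret, and image URL are placeholders.

```python
# Hedged sketch of Imagga's tagging endpoint, which returns tag/confidence
# pairs of the kind listed above. The /v2/tags path, basic-auth scheme, and
# response shape follow Imagga's public API docs as I recall them; verify
# before relying on this. All credentials and URLs are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"                           # hypothetical
API_SECRET = "YOUR_IMAGGA_API_SECRET"                     # hypothetical
IMAGE_URL = "https://example.org/filmer_album_page.jpg"   # hypothetical

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```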

Google
created on 2018-04-19

Microsoft
created on 2018-04-19

text 98.1
book 97.7
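
These two tags have the shape of an Azure Computer Vision tagging response. A hedged sketch follows, assuming the v3.2 REST tag operation; the endpoint, key, and image URL are placeholders to check against current documentation.

```python
# Hedged sketch of the Azure Computer Vision "tag" operation, which returns
# tag/confidence pairs like "text" and "book" above. The /vision/v3.2/tag path,
# header name, and response shape follow Microsoft's public REST docs as I
# recall them; endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # hypothetical
API_KEY = "YOUR_AZURE_VISION_KEY"                               # hypothetical
IMAGE_URL = "https://example.org/filmer_album_page.jpg"         # hypothetical

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": API_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```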

Color Analysis

Face analysis

AWS Rekognition

Age 26-44
Gender Male, 99.3%
Angry 10.4%
Happy 0.3%
Confused 9.8%
Calm 63.4%
Sad 12.3%
Disgusted 0.5%
Surprised 3.3%

AWS Rekognition

Age 30-47
Gender Male, 54.7%
Happy 45.1%
Angry 45.3%
Confused 45.9%
Calm 53.1%
Surprised 45.4%
Disgusted 45.1%
Sad 45.2%

AWS Rekognition

Age 26-43
Gender Female, 53.4%
Surprised 45.5%
Confused 45.7%
Disgusted 46.5%
Angry 46.8%
Sad 47.5%
Calm 47.5%
Happy 45.4%

AWS Rekognition

Age 26-43
Gender Female, 50%
Calm 51%
Angry 46.3%
Surprised 45.3%
Disgusted 45.6%
Confused 45.5%
Sad 45.9%
Happy 45.5%
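
The four face blocks above (age range, gender, emotions) match the output of Amazon Rekognition's face-detection API. A minimal sketch follows, assuming boto3 with configured AWS credentials and a hypothetical file name.

```python
# Minimal sketch of the AWS Rekognition call behind face estimates like the
# blocks above. Assumes boto3 and configured AWS credentials; the file name is
# a hypothetical stand-in for the digitized album page.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("filmer_album_page.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```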

Microsoft Cognitive Services

Age 36
Gender Male

Microsoft Cognitive Services

Age 34
Gender Male

Microsoft Cognitive Services

Age 46
Gender Female
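
The age and gender estimates above match the classic Face API detect operation as it existed when these records were generated (2018-2019). A hedged sketch follows; the endpoint path and attribute names are recalled from that period and may have changed since, and all credentials and URLs are placeholders.

```python
# Hedged sketch of the era's Face API detect call that produced age/gender
# estimates like those above. The /face/v1.0/detect path and the
# returnFaceAttributes parameter follow Microsoft's REST docs of that period as
# I recall them; attribute availability has since changed, and the endpoint,
# key, and image URL here are placeholders.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # hypothetical
API_KEY = "YOUR_FACE_API_KEY"                                   # hypothetical
IMAGE_URL = "https://example.org/filmer_album_page.jpg"         # hypothetical

response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": API_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

for face in response.json():
    attrs = face["faceAttributes"]
    print(f'Age {round(attrs["age"])}')
    print(f'Gender {attrs["gender"].capitalize()}')
```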

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
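
The likelihood buckets above come from Google Cloud Vision face detection. A minimal sketch follows, assuming the google-cloud-vision client library with application-default credentials and a hypothetical file name.

```python
# Minimal sketch of Google Cloud Vision face detection, which reports the
# likelihood buckets (Very unlikely ... Very likely) shown above. Assumes the
# google-cloud-vision client library and application-default credentials; the
# file name is a hypothetical stand-in for the digitized album page.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("filmer_album_page.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Index-to-name mapping used in Google's own samples.
likelihood = ("UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
              "POSSIBLE", "LIKELY", "VERY_LIKELY")

for face in response.face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])
```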

Feature analysis

Amazon

Person 98.3%
Painting 94.6%
Helmet 83.5%

Categories

Imagga

Text analysis

Amazon

M
hl
Sfinue
htucras
Flaur
uu hole Flaur bouts t htucras hl Sfinue
hole
bouts
t
uu
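
The fragments above are raw text-detection output, most likely over the handwritten captions of the album page. A minimal sketch of the corresponding Amazon Rekognition call follows, assuming boto3 with configured AWS credentials and a hypothetical file name.

```python
# Minimal sketch of the Amazon Rekognition text-detection call that produces
# fragments like those above (handwriting tends to come out as garbled guesses
# such as "Sfinue" and "htucras"). Assumes boto3 and configured AWS
# credentials; the file name is a hypothetical placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("filmer_album_page.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # LINE entries group WORD entries; both appear in the raw output.
    print(detection["Type"], detection["DetectedText"],
          round(detection["Confidence"], 1))
```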