Human Generated Data

Title

Untitled (two unidentified women, both kneeling, one applying make-up, one by table holding tea kettle and glass)

Date

1870s

People

Artist: Uyeno Hikoma, Japanese, 1838 - 1904

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from Special Collections, Fine Arts Library, Harvard College Library, Bequest of Evert Jansen Wendell, 2010.59

Machine Generated Data

Tags

Amazon
created on 2019-11-03

Human 99.3
Person 99.3
Person 98.2
Figurine 96.2
Wood 77.8
Sitting 60.7
Art 59.9
Sculpture 57.3
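
These labels and confidence scores (in percent) match the shape of Amazon Rekognition's label-detection output. A minimal sketch of how comparable tags could be reproduced is below, assuming AWS credentials are configured; the image file name is a placeholder, not a file referenced by this record.

import boto3

# Hypothetical reproduction of the label tags above; the file name
# is a placeholder.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50.0,
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")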

Clarifai
created on 2019-11-03

people 99.4
adult 97.2
wear 95.3
painting 94.3
one 92
woman 91.3
furniture 91
group 90.1
education 90.1
man 86.7
two 86.7
portrait 84.2
music 83.9
room 82.9
seat 80.9
child 80.2
exhibition 80
indoors 78.5
art 76.9
nostalgia 75.8
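
Clarifai reports concept scores in the 0-1 range (rendered here as percentages). They could plausibly be fetched with a plain REST call such as the sketch below; the API key, image URL, and model name are all placeholders, not values taken from this record.

import requests

# Hedged sketch against Clarifai's v2 predict endpoint; the key,
# image URL, and model name are assumptions.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))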

Imagga
created on 2019-11-03

acoustic guitar 44.1
guitar 40.6
stringed instrument 39.8
musical instrument 33.5
person 28.4
man 24.3
male 20.6
adult 20.3
people 20.1
sitting 18.9
business 18.8
suit 18
businessman 16.8
scholar 16.5
work 14.9
chair 14.9
intellectual 14.1
room 14.1
portrait 13.6
pretty 13.3
interior 13.3
laptop 12.9
professional 12.8
home 12.8
lifestyle 12.3
office 12
indoors 11.4
computer 11.4
corporate 11.2
alone 10.9
smile 10.7
attractive 10.5
device 10.3
casual 10.2
indoor 10
house 10
executive 9.9
fashion 9.8
working 9.7
one 9.7
couch 9.7
black 9.6
looking 9.6
men 9.4
wall 9.4
happy 9.4
leisure 9.1
silhouette 9.1
handsome 8.9
worker 8.9
smiling 8.7
model 8.6
businesswoman 8.2
clothing 8.2
dress 8.1
lady 8.1
job 8
hair 7.9
human 7.5
outdoors 7.5
vintage 7.4
phone 7.4
relaxing 7.3
success 7.2
color 7.2
notebook 7.2
women 7.1
face 7.1
happiness 7
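
Imagga exposes tagging through its v2 /tags endpoint, which returns English tag names with confidence percentages like those above. A hedged sketch follows; the key, secret, and image URL are placeholders.

import requests

# Hedged sketch against Imagga's v2 tagging endpoint; credentials
# and image URL are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("API_KEY", "API_SECRET"),
)
for tag in resp.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))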

Google
created on 2019-11-03

Microsoft
created on 2019-11-03

text 99.3
wall 99.1
indoor 96.1
person 95.2
clothing 92.6
human face 85.4
old 54.8
picture frame 18.2
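
The Microsoft tags are consistent with Azure Computer Vision's image-analysis API. A hedged sketch against the v3.2 analyze endpoint follows; the resource host and subscription key are placeholders, and the service's 0-1 confidences are scaled to match the percentages above.

import requests

# Hedged sketch; endpoint host and key are placeholders.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": "YOUR_KEY",
        "Content-Type": "application/json",
    },
    json={"url": "https://example.org/photo.jpg"},
)
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))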

Face analysis

AWS Rekognition

Age 8-18
Gender Female, 52.4%
Angry 45%
Fear 45.1%
Disgusted 45%
Surprised 45%
Confused 45%
Sad 51.3%
Calm 48.5%
Happy 45%

AWS Rekognition

Age 18-30
Gender Female, 53.8%
Happy 45%
Disgusted 45%
Confused 45%
Surprised 45%
Sad 45%
Calm 55%
Fear 45%
Angry 45%
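
The two AWS Rekognition entries above (one per detected face) follow the shape of Rekognition's detect_faces response when all attributes are requested. A minimal sketch, assuming configured AWS credentials and a placeholder file name:

import boto3

# Sketch of the face-attribute call that could yield the age ranges,
# gender estimates, and emotion scores above.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")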

Microsoft Cognitive Services

Age 9
Gender Female

Microsoft Cognitive Services

Age 12
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
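
Google Vision reports face attributes as likelihood buckets rather than numeric scores, which matches the "Very unlikely" rows above. A minimal sketch using the google-cloud-vision client library, assuming application credentials are configured and a placeholder file name:

from google.cloud import vision

# Sketch of the face-detection call; likelihoods come back as enum
# values such as VERY_UNLIKELY.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)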

Feature analysis

Amazon

Person 99.3%

Text analysis

Amazon

NAGASAKI.
UYENO,
NAGASAKI. UYENO, H . LKAHK
H .
LKAHK
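
These strings (including the garbled "LKAHK") look like raw line output from Rekognition's text detection. A minimal sketch, again with a placeholder file name:

import boto3

# Sketch of the OCR call behind the detected strings above.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])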

Google

H. UYENO, 寫野上將長本日大 NAGASAKI
上將
NAGASAKI
H.
UYENO,
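
Google's result, including what is likely the studio's Japanese imprint, matches Vision text detection, where the first annotation is the full detected block and the rest are individual tokens. A minimal sketch, with a placeholder file name:

from google.cloud import vision

# Sketch of the Vision OCR call; the first annotation is the full
# detected block, the rest are individual tokens.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)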