Human Generated Data

Title

Yasuo Kuniyoshi (1889-1953)

Date

c. 1950

People

Artist: Sol Libsohn, American, 1914-2001

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1973.88

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.5
Human 99.5
Table Lamp 95.8
Lamp 95.8
Dog 66.2
Mammal 66.2
Animal 66.2
Canine 66.2
Pet 66.2
Furniture 65.8
Electronics 64.5
Pc 57.4
Computer 57.4
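
The Amazon labels above (a name plus a confidence score) are the kind of output returned by Amazon Rekognition's label-detection API. Below is a minimal sketch of such a call using boto3; the file name and confidence threshold are illustrative assumptions, not details of the museums' actual pipeline.

import boto3

def detect_labels(image_path, min_confidence=50):
    """Return (label name, confidence) pairs for a local image file."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,
    )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

# Example with a hypothetical file name:
# for name, conf in detect_labels("kuniyoshi_portrait.jpg"):
#     print(f"{name} {conf:.1f}")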

Imagga
created on 2022-01-22

person 32.5
people 32.4
adult 32
man 27.6
sitting 27.5
business 24.3
indoors 22.9
professional 21.7
male 21.6
portrait 20.7
lifestyle 19.5
casual 19.5
smile 19.3
desk 19
smiling 18.8
businessman 18.6
office 18.4
work 18.1
color 17.8
executive 17.2
businesspeople 17.1
happy 16.9
attractive 16.8
businesswoman 16.4
corporate 16.3
job 15.9
women 15.8
looking 15.2
hairdresser 15.2
computer 14.7
indoor 14.6
men 14.6
interior 14.2
student 14
table 13.9
teamwork 13.9
group 13.7
one 13.4
working 13.3
boy 13.1
confident 12.7
laptop 12.7
communication 12.6
happiness 12.5
holding 12.4
together 12.3
meeting 12.3
face 12.1
friendly 11.9
team 11.7
handsome 11.6
worker 11.6
sit 11.4
pretty 11.2
mature 11.2
education 10.4
stringed instrument 10.4
black 10.2
coffee 10.2
inside 10.1
20s 10.1
suit 10
colleagues 9.7
success 9.7
look 9.6
workplace 9.5
day 9.4
occupation 9.2
musical instrument 9.1
room 9.1
chair 9.1
restaurant 9
classroom 9
cheerful 8.9
couple 8.7
standing 8.7
cute 8.6
blond 8.5
clothing 8.5
two 8.5
call 8.5
study 8.4
book 8.2
alone 8.2
drumstick 8.2
technology 8.2
lady 8.1
equipment 7.9
child 7.8
conference 7.8
seated 7.8
1 7.7
class 7.7
modern 7.7
expression 7.7
confidence 7.7
microphone 7.6
talking 7.6
college 7.6
friends 7.5
leisure 7.5
positive 7.4
grand piano 7.3
notebook 7.2
school 7.2
support 7.1
shop 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 98.3
man 98.1
indoor 95.2
black and white 85.4
human face 81.5
clothing 80

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 100%
Confused 81.2%
Surprised 16%
Calm 1.1%
Sad 0.6%
Disgusted 0.4%
Angry 0.4%
Fear 0.2%
Happy 0%
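
The age range, gender, and emotion scores above correspond to the face-attribute output of Amazon Rekognition. The sketch below shows how such values can be requested with boto3; the file name is a placeholder and the snippet is only an illustration, not the museums' actual workflow.

import boto3

def analyze_faces(image_path):
    """Print age range, gender, and emotion confidences for each detected face."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")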

Microsoft Cognitive Services

Age 53
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Dog 66.2%

Captions

Microsoft

a man sitting in front of a laptop 78.2%
a man sitting at a desk looking at a laptop 78.1%
a man sitting down and looking at the camera 78%
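
Captions with confidence scores like the three above can be produced by the Microsoft Azure Computer Vision describe-image call. The following is a minimal sketch assuming the Python SDK; the endpoint, key, and image URL are placeholders, and this is an illustration rather than the museums' actual pipeline.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

endpoint = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
key = "<your-key>"  # placeholder

client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))
# Request up to three caption candidates for a publicly reachable image URL.
description = client.describe_image("https://example.com/image.jpg", max_candidates=3)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")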