Human Generated Data

Title

Josef Albers Teaching at the Harvard Graduate School of Design

Date

1950

People

Artist: David Cooper, American, 1929

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Museum Purchase, BR50.535

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99.6
Person 99.6
Person 98.2
Clothing 85.6
Helmet 85.6
Apparel 85.6
Finger 57.6
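Each machine-generated tag above pairs a label with a confidence score in percent (e.g. Person 99.8, Finger 57.6). As a minimal sketch, with a few of the Amazon tags hard-coded for illustration, this is how such tags might be filtered by a confidence threshold before display:

```python
# Hypothetical post-processing of machine-generated tags:
# each tag is a (label, confidence) pair, confidence in percent.
tags = [
    ("Person", 99.8), ("Human", 99.8), ("Clothing", 85.6),
    ("Helmet", 85.6), ("Finger", 57.6),
]

def filter_tags(tags, threshold=80.0):
    """Keep only tags at or above the confidence threshold, highest first."""
    kept = [t for t in tags if t[1] >= threshold]
    return sorted(kept, key=lambda t: t[1], reverse=True)

print(filter_tags(tags))
```

With the default 80% threshold, the low-confidence Finger tag is dropped while Person, Human, Clothing, and Helmet survive.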

Imagga
created on 2022-01-09

senior 40.3
man 39
couple 38.3
people 35.1
male 33
happy 32
home 31.9
person 30.7
smiling 26
together 25.4
elderly 24.9
indoors 24.6
adult 23.6
sitting 23.2
men 21.5
mature 20.5
grandma 19.4
retired 19.4
retirement 18.2
husband 18.1
grandfather 18.1
teacher 18.1
lifestyle 18.1
kin 18
cheerful 17.1
old 16.7
happiness 16.5
enjoying 16.1
women 15.8
family 15.1
drink 15
drinking 14.4
smile 14.3
room 14.1
table 13.8
child 13.8
70s 13.8
older 13.6
barbershop 13.6
married 13.4
love 13.4
wife 13.3
friends 13.2
friendship 13.1
education 13
professional 12.9
classroom 12.9
two 12.7
holding 12.4
world 12.3
restaurant 12.3
group 12.1
camera 12
casual clothing 11.7
60s 11.7
teaching 11.7
portrait 11.6
talking 11.4
school 11.4
looking 11.2
alcohol 11.1
day 11
indoor 11
business 10.9
horizontal 10.9
shop 10.5
worker 10.5
fun 10.5
casual 10.2
aged 10
to 9.7
two people 9.7
studying 9.6
student 9.6
laughing 9.5
togetherness 9.4
meeting 9.4
wine 9.2
leisure 9.1
office 9
team 9
sixties 8.8
businessman 8.8
waist 8.7
marriage 8.5
learning 8.5
relationship 8.4
mercantile establishment 8.4
color 8.3
teamwork 8.3
active 8.3
20s 8.2
businesswoman 8.2
meal 8.1
kitchen 8.1
kid 8
seniors 7.9
grandmother 7.8
boy 7.8
face 7.8
affection 7.7
mother 7.7
class 7.7
health 7.6
loving 7.6
adults 7.6
eating 7.6
females 7.6
desk 7.6
enjoyment 7.5
relaxed 7.5
study 7.5
help 7.4
care 7.4
lady 7.3
children 7.3
food 7.3

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

clothing 99.1
person 98.8
man 96.2
text 95.4
human face 94.3
window 82.4
old 72.4
smile 65.4
posing 38.9

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 100%
Calm 50.4%
Sad 19.2%
Confused 12.1%
Angry 8.3%
Surprised 3.9%
Happy 3.3%
Fear 1.7%
Disgusted 1%

AWS Rekognition

Age 62-72
Gender Male, 99.9%
Calm 82.6%
Sad 13.6%
Confused 2.4%
Happy 0.4%
Angry 0.4%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Calm 99.9%
Angry 0%
Sad 0%
Surprised 0%
Confused 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Male, 85.9%
Calm 82.1%
Sad 14.9%
Happy 1.8%
Fear 0.3%
Confused 0.3%
Disgusted 0.2%
Angry 0.2%
Surprised 0.1%
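Each AWS Rekognition face block above reports a full emotion distribution; the top entry (e.g. Calm 50.4%) is the model's dominant emotion for that face. A minimal sketch, with the first face's scores copied from the record for illustration, of selecting the dominant emotion:

```python
# Emotion scores (percent) for the first detected face, from the record above.
emotions = {
    "Calm": 50.4, "Sad": 19.2, "Confused": 12.1, "Angry": 8.3,
    "Surprised": 3.9, "Happy": 3.3, "Fear": 1.7, "Disgusted": 1.0,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(emotions))  # → ('Calm', 50.4)
```

Note that the distributions sum to roughly 100%, so a dominant score near 50% (as for the first face) signals far more ambiguity than one near 100% (as for the third face, Calm 99.9%).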

Microsoft Cognitive Services

Age 29
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Helmet 85.6%

Captions

Microsoft

a vintage photo of a group of people standing in front of a window 84%
a vintage photo of a group of people posing for the camera 83.9%
a group of people standing in front of a window 83.8%