Human Generated Data

Title

Josef Albers Teaching at the Harvard Graduate School of Design

Date

1950

People

Artist: David Cooper, American, b. 1929

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Museum Purchase, BR50.534

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.1
Human 99.1
Person 98.7
Clothing 97.9
Apparel 97.9
Shirt 89.3
Sitting 85.1
Suit 81
Coat 81
Overcoat 81
Face 72.3
Finger 63.5
Tie 58.9
Accessories 58.9
Accessory 58.9
Indoors 55.5
Sleeve 55.3
Monitor 55.2
Electronics 55.2
Screen 55.2
Display 55.2

Clarifai
created on 2023-10-25

people 100
group 99.7
adult 98.8
group together 98
man 97.7
three 96.8
two 96.1
portrait 96
music 95.1
woman 93.2
monochrome 93.2
administration 93
several 92.3
four 91.3
actor 91
facial expression 89.8
retro 87.9
musician 85
leader 84.8
room 81.2

Imagga
created on 2022-01-09

man 46.4
male 31.4
people 31.2
person 26.2
child 20.9
family 20.5
bow tie 20.5
couple 20
happy 18.8
smiling 18.1
portrait 17.5
love 16.6
adult 16.5
sibling 16.4
necktie 16.2
happiness 15.7
men 15.5
sitting 14.6
indoors 14.1
world 14
black 13.8
dad 13.6
home 13.6
face 13.5
business 12.8
father 12.2
handsome 11.6
lifestyle 11.6
holding 11.6
kid 11.5
businessman 11.5
together 11.4
cheerful 11.4
women 11.1
casual 11
parent 10.8
patient 10.8
life 10.5
boy 10.4
hands 10.4
clothing 10.3
work 10.2
smile 10
professional 10
mother 9.6
photographer 9.6
husband 9.5
wife 9.5
corporate 9.5
two 9.3
horizontal 9.2
occupation 9.2
grandfather 9.1
care 9.1
group 8.9
office 8.8
looking 8.8
nurse 8.8
garment 8.5
room 8.5
senior 8.4
relationship 8.4
mature 8.4
old 8.4
leisure 8.3
human 8.3
worker 8.2
dress 8.1
suit 8.1
to 8
job 8
interior 8
color 7.8
head 7.6
executive 7.5
hospital 7.5
technology 7.4
little 7.1
day 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 99.9
human face 98
wall 97.4
clothing 96.9
smile 92.4
man 92.1
indoor 85.7
text 80.6
glasses 76.9
people 59.3
black and white 51.6
old 40.6
crowd 0.6

Face analysis

AWS Rekognition

Age 56-64
Gender Male, 99.8%
Sad 49.7%
Calm 21.1%
Confused 13.2%
Surprised 10.1%
Angry 2.5%
Disgusted 2%
Fear 0.7%
Happy 0.6%

AWS Rekognition

Age 37-45
Gender Male, 100%
Calm 95%
Confused 1.3%
Sad 1%
Happy 0.7%
Angry 0.6%
Fear 0.6%
Disgusted 0.5%
Surprised 0.4%

AWS Rekognition

Age 37-45
Gender Female, 98.5%
Happy 97.8%
Calm 0.6%
Surprised 0.6%
Angry 0.5%
Confused 0.3%
Disgusted 0.1%
Fear 0.1%
Sad 0.1%

AWS Rekognition

Age 20-28
Gender Male, 99.7%
Happy 94.2%
Calm 2.5%
Sad 1%
Surprised 0.7%
Angry 0.6%
Confused 0.5%
Disgusted 0.3%
Fear 0.2%

Microsoft Cognitive Services

Age 41
Gender Female

Microsoft Cognitive Services

Age 45
Gender Male

Microsoft Cognitive Services

Age 33
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Suit 81%