Human Generated Data

Title

Max Bill and Edmund Collein at the Bauhaus

Date

1929

People

Artist: T. Lux Feininger, American 1910 - 2011

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BR71.21.24

Copyright

© T. Lux Feininger

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Glasses 99.2
Accessories 99.2
Accessory 99.2
Person 99.2
Human 99.2
Person 97.9
Finger 94
Face 93
Head 57.7
Icing 57.6
Cake 57.6
Creme 57.6
Cream 57.6
Dessert 57.6
Food 57.6

Clarifai
created on 2018-05-08

people 99.8
adult 99.4
portrait 99
two 98.4
man 98.2
one 97.3
eyewear 96
facial expression 96
monochrome 95.3
wear 94
leader 92.7
group 91.1
veil 90.5
administration 90.1
woman 89.5
three 88
music 87.3
interaction 86.3
actor 85.8
war 85.7

Imagga
created on 2018-05-08

man 37
face 34.8
male 32.7
portrait 31.7
person 30.9
people 22.9
uniform 22.4
military uniform 21.8
clothing 21.5
mustache 21.3
black 18
snorkel 18
smile 17.8
hair 17.4
device 17.2
expression 17.1
happy 16.9
adult 16.9
breathing device 16.5
boy 16.5
handsome 16
head 15.9
casual 15.2
glasses 14.8
old 13.9
men 13.7
youth 13.6
covering 13.3
guy 13.3
look 13.1
looking 12.8
business 12.8
work 12.6
attractive 11.9
serious 11.4
hat 11.3
hand 10.6
fashion 10.5
one 10.4
eyes 10.3
emotion 10.1
model 10.1
goggles 10.1
studio 9.9
kid 9.7
human 9.7
thoughtful 9.7
grandfather 9.5
consumer goods 9.3
phone 9.2
smiling 8.7
happiness 8.6
cute 8.6
professional 8.6
senior 8.4
communication 8.4
student 8.1
child 8.1
eye 8
businessman 7.9
elderly 7.7
close 7.4
closeup 7.4
confident 7.3
weapon 7.2
wet 7.2
women 7.1

Google
created on 2018-05-08

Microsoft
created on 2018-05-08

person 99.6
man 97.6
indoor 96.5

Face analysis

AWS Rekognition

Age 30-47
Gender Male, 99.5%
Disgusted 1.1%
Sad 2.6%
Confused 8.7%
Calm 65.7%
Surprised 16.7%
Angry 4%
Happy 1.2%

AWS Rekognition

Age 20-38
Gender Male, 91.8%
Calm 80.6%
Sad 7.4%
Confused 1.4%
Disgusted 4.8%
Happy 1.6%
Surprised 0.9%
Angry 3.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Glasses 99.2%
Person 99.2%