Human Generated Data

Title

[Gerhard Marcks and Lux Feininger]

Date

1950

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.355.2

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Human 99.7
Person 99.7
Face 83.3
Nature 83.2
Smoke 82.4
Musician 75.9
Musical Instrument 75.9
Leisure Activities 68.1
Piano 67.1
Performer 67.1
Pianist 67.1
Fog 63.7
Photography 61.5
Portrait 61.5
Photo 61.5
Smog 56.2
Pollution 55.4

Clarifai
created on 2019-05-29

people 99.9
adult 98.9
one 98.9
portrait 98.5
man 95.4
monochrome 92.1
music 89.3
street 87.5
profile 87.1
administration 86.7
wear 85.8
woman 81
side view 80.9
boy 77.3
musician 77.3
building 75.1
two 74.9
sepia 74.7
war 73.6
actor 73.4

Imagga
created on 2019-05-29

man 41.7
person 37.6
male 35.6
call 31.9
businessman 30
people 29.6
business 29.2
phone 27.7
adult 27.2
face 25.6
office 23.3
mobile 21.7
portrait 21.4
professional 20.4
smile 20
happy 18.2
serious 18.1
manager 17.7
communication 17.6
binoculars 17.6
looking 17.6
confident 17.3
corporate 17.2
senior 16.9
telephone 16.6
attractive 16.1
hair 15.9
expression 15.4
hand 15.2
shirt 15
mature 14.9
suit 14.8
hairdresser 14.7
handsome 14.3
talking 14.3
businesspeople 14.2
tie 14.2
one 14.2
executive 14.1
old 13.9
lifestyle 13
alone 12.8
casual 12.7
conversation 12.6
work 12.6
happiness 12.5
talk 12.5
elderly 12.5
cell 12.4
wind instrument 11.9
device 11.8
cellphone 11.7
smiling 11.6
look 11.4
human 11.3
optical instrument 11.1
harmonica 10.9
world 10.7
couple 10.5
thinking 10.5
eyes 10.3
sitting 10.3
men 10.3
day 10.2
pretty 9.8
lady 9.7
job 9.7
depression 9.7
outdoors 9.7
joy 9.2
fashion 9.1
guy 9
free-reed instrument 8.7
love 8.7
model 8.6
desk 8.5
outdoor 8.4
glasses 8.3
groom 8.3
camera 8.3
emotion 8.3
holding 8.3
friendly 8.2
businesswoman 8.2
technology 8.2
musical instrument 8.2
success 8.1
sexy 8
computer 8
working 8
indoors 7.9
instrument 7.9
hands 7.8
middle aged 7.8
busy 7.7
sky 7.7
communicate 7.7
career 7.6
indoor 7.3
worker 7.3
laptop 7.3
scholar 7.2
black 7.2
modern 7

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

person 99.4
man 99
human face 92.7
black and white 91.9
standing 91
outdoor 90.4
clothing 87.8
sky 86.9
old 72.4
suit 66.6

Face analysis

AWS Rekognition

Age 27-44
Gender Male, 92.7%
Disgusted 9.9%
Surprised 3.9%
Angry 8%
Sad 12.9%
Happy 28.1%
Confused 4%
Calm 33.2%

AWS Rekognition

Age 23-38
Gender Female, 65.2%
Happy 2.4%
Confused 4.6%
Sad 75.4%
Calm 7.9%
Surprised 2.8%
Disgusted 2.8%
Angry 4.1%

Microsoft Cognitive Services

Age 34
Gender Male

Feature analysis

Amazon

Person 99.7%
