Human Generated Data

Title

[Men and women playing a game on ship deck]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.466.25

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 99.5
Person 99.5
Person 97.7
Pedestrian 96.7
Clothing 89.2
Apparel 89.2
Person 84.9
Person 83
Building 80.9
Person 77.3
Architecture 73.6
Shorts 73.1
Asphalt 71.6
Tarmac 71.6
Leisure Activities 70.9
Path 70.8
People 63.7
Photography 61.8
Photo 61.8
Guitarist 60.1
Guitar 60.1
Musical Instrument 60.1
Musician 60.1
Performer 60.1
Column 57.9
Pillar 57.9
Stage 57.9
Sailor Suit 57.6
Road 57
Dance Pose 56.6
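The Amazon tags above are the kind of label/confidence pairs returned by AWS Rekognition's DetectLabels API. A minimal sketch of turning such a response into the lines listed here, assuming the documented response shape; the `sample_response` dict below is hand-written for illustration, not the actual response for this photograph:

```python
# Sketch: formatting a DetectLabels-style response into "Name Confidence"
# lines like those above. The sample dict mimics the documented response
# shape; the values are illustrative, not the real analysis of this image.

def format_labels(response, min_confidence=55.0):
    """Return 'Name Confidence' lines, highest confidence first."""
    labels = [
        (label["Name"], label["Confidence"])
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    return [f"{name} {conf:.1f}" for name, conf in labels]

# Hand-written example response (illustrative values only).
sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.5},
        {"Name": "Clothing", "Confidence": 89.2},
        {"Name": "Dance Pose", "Confidence": 56.6},
    ]
}

for line in format_labels(sample_response):
    print(line)
```

In a live call, `sample_response` would come from `boto3`'s Rekognition client (`detect_labels`), which requires AWS credentials; only the local formatting step is shown here.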

Clarifai
created on 2019-11-19

people 99.8
one 97.5
adult 97.5
man 96.6
monochrome 95.3
woman 95.3
two 93.4
group together 92.9
music 92.8
wear 91.3
recreation 87.8
street 87.5
group 86.8
indoors 86.3
vehicle 86
action 84.6
dancing 83.2
dancer 79.9
skill 79.5
musician 78.9

Imagga
created on 2019-11-19

dark 22.5
man 21.5
person 20.7
people 20.1
black 18.3
adult 17.5
television 16.4
model 16.3
sexy 16.1
body 16
one 15.7
light 14.7
portrait 14.2
studio 13.7
hair 13.5
male 13.5
erotic 13.4
fashion 12.8
car 12.3
world 12
equipment 11.4
attractive 11.2
night 10.7
device 10.5
human 10.5
urban 10.5
style 10.4
seat 10.3
skin 10.2
sensual 10
vehicle 9.7
silhouette 9.1
pretty 9.1
job 8.8
water 8.7
work 8.6
automobile 8.6
motion 8.6
telecommunication system 8.4
city 8.3
inside 8.3
art 8.3
sensuality 8.2
exercise 8.2
lady 8.1
wet 8
metal 8
worker 8
looking 8
posing 8
women 7.9
travel 7.7
dance 7.7
industry 7.7
conveyance 7.7
grunge 7.7
stretcher 7.5
enjoy 7.5
fun 7.5
dancer 7.4
business 7.3
industrial 7.3
cockpit 7.2
transportation 7.2
face 7.1
cadaver 7.1
steel 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

black and white 94.9
text 92.7
indoor 86.6
street 84.4
monochrome 81.3
clothing 70.2
person 63.5
footwear 59.3

Face analysis

Amazon

AWS Rekognition

Age 14-26
Gender Female, 50.1%
Disgusted 49.5%
Sad 50.1%
Angry 49.5%
Calm 49.6%
Surprised 49.5%
Fear 49.7%
Confused 49.5%
Happy 49.5%

AWS Rekognition

Age 15-27
Gender Male, 50.1%
Calm 49.6%
Sad 49.9%
Confused 49.5%
Angry 49.5%
Happy 49.5%
Surprised 49.5%
Fear 49.9%
Disgusted 49.5%

AWS Rekognition

Age 15-27
Gender Female, 50.2%
Angry 49.5%
Disgusted 49.5%
Fear 50.3%
Sad 49.6%
Surprised 49.5%
Calm 49.5%
Confused 49.5%
Happy 49.5%
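Each block above summarizes one FaceDetail object from AWS Rekognition's DetectFaces API, which reports an age range, a gender estimate, and per-emotion confidences. A minimal sketch of producing such a summary, assuming the documented response shape; the `face_detail` dict is a hand-written example, not the actual analysis of this photograph:

```python
# Sketch: summarizing one FaceDetail from a DetectFaces-style response into
# the "Age / Gender / Emotion" lines above. The face_detail dict mimics the
# documented response shape; values are illustrative only.

def summarize_face(face_detail):
    """Return summary lines: age range, gender, then emotions by confidence."""
    age = face_detail["AgeRange"]
    gender = face_detail["Gender"]
    lines = [
        f"Age {age['Low']}-{age['High']}",
        f"Gender {gender['Value']}, {gender['Confidence']:.1f}%",
    ]
    # Emotions arrive as uppercase type names (e.g. "SAD"); sort high to low.
    emotions = sorted(
        face_detail["Emotions"], key=lambda e: e["Confidence"], reverse=True
    )
    lines.extend(
        f"{e['Type'].capitalize()} {e['Confidence']:.1f}%" for e in emotions
    )
    return lines

# Hand-written example FaceDetail (illustrative values only).
face_detail = {
    "AgeRange": {"Low": 14, "High": 26},
    "Gender": {"Value": "Female", "Confidence": 50.1},
    "Emotions": [
        {"Type": "SAD", "Confidence": 50.1},
        {"Type": "CALM", "Confidence": 49.6},
    ],
}

for line in summarize_face(face_detail):
    print(line)
```

Note that the near-uniform emotion scores around 49–50% in the records above suggest the classifier had little confidence in any single emotion for these small faces.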

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft
created on 2019-11-19

a person standing in a room 60.7%
a person in a white room 52.8%