Human Generated Data

Title

[People disembarking from ship]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.472.8

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-05

Person 99.1
Human 99.1
Person 98.8
Person 97
Person 96.9
Person 96.6
Person 96.4
People 96.3
Person 95.7
Person 95.4
Person 95.1
Person 92.5
Person 90.9
Person 89.5
Person 89
Sport 88.7
Sports 88.7
Team Sport 88.7
Team 88.7
Person 88.4
Person 87.7
Person 87.1
Person 86.2
Person 84.6
Sphere 84
Person 79.8
Person 71.7
Flooring 65.2
Apparel 60.1
Clothing 60.1
Basketball 56.9
Basketball Court 56
Person 41.7
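The Amazon tags above follow the shape of a Rekognition `DetectLabels` response: a list of name/confidence pairs. A minimal sketch of filtering such a list by a confidence threshold — the `response` dict below is a hand-built stand-in copying a few of the tags above, not a live API call:

```python
# Extract labels above a confidence threshold from a
# Rekognition DetectLabels-style response dict.
def labels_above(response, threshold):
    """Return (name, confidence) pairs at or above `threshold`,
    sorted by descending confidence."""
    pairs = [(l["Name"], l["Confidence"]) for l in response["Labels"]]
    return sorted(
        (p for p in pairs if p[1] >= threshold),
        key=lambda p: p[1],
        reverse=True,
    )

# Hand-built stand-in mirroring a subset of the tags listed above.
response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.1},
        {"Name": "Sport", "Confidence": 88.7},
        {"Name": "Basketball", "Confidence": 56.9},
        {"Name": "Person", "Confidence": 41.7},
    ]
}

print(labels_above(response, 80))
# [('Person', 99.1), ('Sport', 88.7)]
```

A higher threshold (e.g. 95) would keep only the near-certain "Person" detections and drop the sports-related guesses, which is one way such tag lists are usually pruned before display.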

Clarifai
created on 2021-04-05

people 99.9
group together 99.9
many 99.5
adult 98.7
group 97.8
sports equipment 96.8
two 96.3
several 96.1
man 95.7
athlete 94.6
one 94.5
wear 93.9
three 93.6
competition 92.8
recreation 92.2
outfit 89.6
four 88.1
web 87.1
woman 85.5
uniform 84.7

Imagga
created on 2021-04-05

wall 33.2
architecture 32.6
building 23.1
old 23
city 19.9
structure 18.2
travel 17.6
tourism 17.3
history 16.1
historic 15.6
sky 14.7
landmark 14.4
light 14
monument 14
ancient 13.8
water 13.3
night 13.3
urban 13.1
stone 13
culture 12.8
construction 12
grunge 11.9
dark 11.7
brick 11
religion 10.7
stage 10.3
famous 10.2
landscape 9.7
scene 9.5
house 9.5
roof 9.4
historical 9.4
town 9.3
church 9.2
silhouette 9.1
texture 9
sculpture 8.9
tower 8.9
antique 8.6
industry 8.5
vintage 8.3
tourist 8.2
medieval 7.7
old fashioned 7.6
platform 7.6
power 7.6
religious 7.5
design 7.3
art 7.3
industrial 7.3
black 7.2
paper 7.1
palace 7.1
device 7.1

Google
created on 2021-04-05

Microsoft
created on 2021-04-05

text 81.9
person 78
baseball 71.2
player 69.4
black and white 53.8
watching 42.1

Face analysis

AWS Rekognition

Age 9-19
Gender Female, 53.1%
Calm 26.5%
Confused 21.7%
Sad 20.7%
Angry 11.4%
Surprised 9%
Happy 4.9%
Fear 3.9%
Disgusted 1.8%

AWS Rekognition

Age 24-38
Gender Female, 79.8%
Happy 65.8%
Calm 18.1%
Sad 4.3%
Fear 3.1%
Disgusted 3%
Confused 2.9%
Angry 1.7%
Surprised 1%

AWS Rekognition

Age 25-39
Gender Female, 72.2%
Happy 54.2%
Calm 25.2%
Sad 7.3%
Disgusted 6%
Confused 2.3%
Angry 1.8%
Fear 1.7%
Surprised 1.6%

AWS Rekognition

Age 36-52
Gender Male, 71.3%
Calm 51%
Happy 40.4%
Disgusted 3.6%
Surprised 1.5%
Angry 1.5%
Sad 0.9%
Confused 0.8%
Fear 0.2%

AWS Rekognition

Age 19-31
Gender Female, 66.7%
Sad 39.7%
Calm 31.1%
Happy 15.9%
Fear 4.2%
Confused 2.9%
Surprised 2.6%
Angry 2.1%
Disgusted 1.4%

AWS Rekognition

Age 15-27
Gender Female, 74.5%
Sad 85.3%
Happy 6.1%
Fear 3.4%
Calm 3.4%
Angry 1%
Confused 0.3%
Disgusted 0.3%
Surprised 0.1%

AWS Rekognition

Age 50-68
Gender Male, 76.6%
Calm 86.6%
Disgusted 5.5%
Happy 4%
Surprised 1.6%
Angry 0.9%
Sad 0.8%
Confused 0.4%
Fear 0.1%

AWS Rekognition

Age 31-47
Gender Male, 71%
Fear 95.1%
Sad 2.9%
Angry 0.8%
Happy 0.6%
Surprised 0.2%
Calm 0.2%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 36-54
Gender Male, 55.2%
Happy 56.2%
Calm 29.7%
Sad 12.2%
Surprised 0.5%
Confused 0.4%
Angry 0.4%
Fear 0.3%
Disgusted 0.3%

AWS Rekognition

Age 32-48
Gender Female, 51.4%
Sad 83.3%
Calm 4.4%
Happy 4.2%
Fear 2.9%
Angry 2.5%
Confused 1.4%
Disgusted 0.7%
Surprised 0.6%
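Each AWS Rekognition face block above is an emotion distribution; the face's reported mood is simply the highest-confidence entry. A small sketch, using a hand-built dict shaped like the first face record above rather than a live `DetectFaces` call:

```python
# Pick the dominant emotion from a Rekognition DetectFaces-style
# emotion list (a list of {"Type", "Confidence"} dicts).
def dominant_emotion(emotions):
    """Return (Type, Confidence) of the highest-scoring emotion."""
    top = max(emotions, key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

# Hand-built stand-in copying the first face record listed above.
face = {
    "AgeRange": {"Low": 9, "High": 19},
    "Gender": {"Value": "Female", "Confidence": 53.1},
    "Emotions": [
        {"Type": "CALM", "Confidence": 26.5},
        {"Type": "CONFUSED", "Confidence": 21.7},
        {"Type": "SAD", "Confidence": 20.7},
        {"Type": "ANGRY", "Confidence": 11.4},
    ],
}

print(dominant_emotion(face["Emotions"]))
# ('CALM', 26.5)
```

Note how flat this particular distribution is (26.5% vs. 21.7%): for faces like this one, the "dominant" emotion is barely above the runner-up, which is worth surfacing before treating the label as meaningful.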

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a group of people watching a band on stage in front of a crowd 50.3%
a group of people watching a band on stage in front of a building 50.2%
a group of people playing instruments and performing on a stage 38.4%