Human Generated Data

Title

[Passengers on ship]

Date

unknown

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.149.20

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-04

Person 99.7
Human 99.7
Person 98.6
Person 98.5
Shoe 98.4
Clothing 98.4
Apparel 98.4
Footwear 98.4
Person 92.1
Person 92.1
Person 90.7
Person 73.5
Person 69.2
Shoe 65.8
Coat 61
Overcoat 61
Undershirt 59.1
Person 58.1
Shorts 56.9
Path 56.6
Meal 56
Food 56
Pedestrian 56
Shoe 53
Person 49.8

Clarifai
created on 2021-04-04

people 99.9
group 98.4
group together 98.4
street 96.9
man 96.8
adult 96.7
music 94.4
many 94.4
woman 93.7
monochrome 92.4
wear 90.6
vehicle 88.6
recreation 88.3
administration 85.6
musician 85
several 84.1
child 83.8
police 82.5
boy 82.3
war 81.6

Imagga
created on 2021-04-04

percussion instrument 40.9
marimba 35.3
musical instrument 33.2
man 31.6
people 27.3
person 21.4
male 20.6
adult 17.7
barbershop 16.7
industrial 16.3
shop 16
industry 15.4
light 14.7
men 14.6
work 14.4
dark 14.2
worker 13.5
steel 12.4
portrait 12.3
black 12
safety 12
old 11.8
factory 11.6
business 11.5
working 11.5
mercantile establishment 11.4
device 11.3
room 11.3
sitting 11.2
construction 11.1
mask 10.9
city 10.8
fashion 10.5
metal 10.5
building 10.2
model 10.1
modern 9.8
chair 9.8
job 9.7
one 9.7
sexy 9.6
urban 9.6
machine 9.5
passion 9.4
fire 9.4
power 9.2
inside 9.2
house 9.2
occupation 9.2
protection 9.1
equipment 9
manufacturing 8.8
lifestyle 8.7
skill 8.7
office 8.4
hot 8.4
street 8.3
sensual 8.2
night 8
welder 7.9
welding 7.9
standing 7.8
labor 7.8
helmet 7.7
repair 7.7
world 7.6
place of business 7.5
happy 7.5
fun 7.5
silhouette 7.4
technology 7.4
heat 7.4
lady 7.3
home 7.2
love 7.1
architecture 7

Google
created on 2021-04-04

Microsoft
created on 2021-04-04

clothing 97.5
person 96.3
black and white 95.4
man 91.2
text 78
footwear 76.7
monochrome 74.1
concert 67.6
street 65.7

Face analysis

Amazon

AWS Rekognition

Age 24-38
Gender Female, 81.9%
Sad 27.6%
Calm 26.6%
Surprised 15.6%
Happy 15.2%
Fear 5.7%
Confused 5.6%
Angry 2.4%
Disgusted 1.3%

AWS Rekognition

Age 36-52
Gender Male, 52.3%
Sad 58%
Calm 23.9%
Happy 6.6%
Confused 5%
Fear 2.4%
Angry 2%
Surprised 1.6%
Disgusted 0.5%

Feature analysis

Amazon

Person 99.7%
Shoe 98.4%

Captions

Microsoft

a group of people in a room 80.5%
a group of people standing in a room 75.5%
a group of people sitting in a room 64.3%