Human Generated Data

Title

[Band marching in parade, Stockholm]

Date

1936

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.195.3

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-18

Human 99.3
Person 99.3
Person 99.2
Person 98
Person 93.7
Pedestrian 89.7
Person 88.6
Person 85.4
Person 83.7
Person 82.7
Person 77.2
Person 75.1
Person 73.6
Person 73
Person 70.7
Clothing 67.9
Apparel 67.9
Urban 61.8
Text 58.8
Meal 58.6
Food 58.6
Screen 58.4
LCD Screen 58.4
Electronics 58.4
Monitor 58.4
Display 58.4
Shorts 57.4
Restaurant 56.7
Cafeteria 56.7
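
The label/confidence pairs above are the kind of output returned by AWS Rekognition's label detection. The following is a minimal sketch of how comparable tags could be generated with the boto3 client; the local file name band_marching_stockholm.jpg and the threshold values are assumptions for illustration, not part of the museum record.

import boto3

# Hypothetical local copy of the photograph; the source image is not included in this record.
IMAGE_PATH = "band_marching_stockholm.jpg"

rekognition = boto3.client("rekognition")  # credentials and region come from the AWS configuration

with open(IMAGE_PATH, "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=50,        # roughly the length of the list above
        MinConfidence=55.0,  # the lowest confidence shown above is about 56.7
    )

# Print "Name score" pairs in the same form used in this record.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')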

Clarifai
created on 2019-11-18

people 99.9
many 99.3
group together 99.2
group 98.9
adult 97.5
man 94.5
one 94.5
crowd 93.9
vehicle 92.9
administration 92.7
wear 92.6
military 89.1
woman 88.6
war 87.2
several 86.7
transportation system 86.5
monochrome 86.1
spectator 86.1
leader 85.1
two 82.1

Imagga
created on 2019-11-18

city 26.6
urban 22.7
building 22.3
people 20.1
world 19.2
old 18.1
street 14.7
architecture 14.3
man 14.2
shop 12.8
travel 12
train 11.7
transportation 11.7
person 11.2
adult 11.2
life 11.2
business 10.9
passenger 10.8
mercantile establishment 10.5
scene 10.4
portrait 10.3
light 10
station 10
tourism 9.9
barbershop 9.9
window 9.8
human 9.7
black 9.6
women 9.5
men 9.4
wall 9.4
historic 9.2
art 9.1
vintage 9.1
structure 9
antique 8.6
ancient 8.6
crowd 8.6
journey 8.5
house 8.5
transport 8.2
child 7.9
culture 7.7
roof 7.6
male 7.6
buildings 7.6
dark 7.5
traditional 7.5
lifestyle 7.2
night 7.1

Google
created on 2019-11-18

Microsoft
created on 2019-11-18

text 97.7
person 95.7
clothing 88.9
man 85.8
black and white 73.7
crowd 0.5

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 51-69
Gender Male, 52.5%
Angry 45%
Sad 54.6%
Fear 45%
Disgusted 45%
Happy 45%
Surprised 45%
Calm 45.4%
Confused 45%

AWS Rekognition

Age 19-31
Gender Male, 50.3%
Calm 49.9%
Surprised 49.5%
Happy 49.6%
Confused 49.5%
Disgusted 49.5%
Fear 49.5%
Angry 49.5%
Sad 49.9%

AWS Rekognition

Age 9-19
Gender Female, 50.1%
Surprised 49.5%
Disgusted 49.5%
Fear 49.8%
Happy 49.6%
Angry 49.5%
Confused 49.5%
Calm 49.6%
Sad 50%

AWS Rekognition

Age 44-62
Gender Male, 50.1%
Happy 49.5%
Confused 49.5%
Disgusted 49.5%
Fear 49.7%
Calm 49.6%
Sad 49.9%
Angry 49.8%
Surprised 49.5%

AWS Rekognition

Age 30-46
Gender Male, 50.2%
Calm 49.7%
Confused 49.5%
Disgusted 49.5%
Angry 49.5%
Happy 49.5%
Surprised 49.5%
Sad 50.1%
Fear 49.6%

AWS Rekognition

Age 29-45
Gender Female, 50.2%
Happy 49.5%
Sad 49.6%
Fear 49.7%
Angry 49.5%
Disgusted 49.5%
Surprised 50%
Calm 49.5%
Confused 49.6%

AWS Rekognition

Age 12-22
Gender Male, 50%
Surprised 49.5%
Confused 49.5%
Disgusted 49.5%
Sad 50.4%
Angry 49.5%
Happy 49.5%
Calm 49.5%
Fear 49.5%
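
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with its confidence, and a score for each emotion. Below is a minimal sketch of how similar per-face output could be produced with the boto3 face-detection call, under the same hypothetical file-name assumption as the label example.

import boto3

rekognition = boto3.client("rekognition")

with open("band_marching_stockholm.jpg", "rb") as image_file:  # hypothetical local file
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions rather than only bounding boxes
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back as a list of type/confidence pairs, as in the blocks above.
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')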

Feature analysis

Amazon

Person 99.3%

Text analysis

Amazon

ACKMAN
tAR ACKMAN
tAR

Google

IR BCKMAT
IR
BCKMAT
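
The strings above are raw OCR detections, most likely picked up from signage or banners in the photograph, which is why they are fragmentary. The Amazon results would come from Rekognition's text detection; a minimal sketch, under the same hypothetical file-name assumption, follows (the Google results would come from a different service, such as Cloud Vision, and are not reproduced here).

import boto3

rekognition = boto3.client("rekognition")

with open("band_marching_stockholm.jpg", "rb") as image_file:  # hypothetical local file
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

# Rekognition returns both full LINE detections and their component WORDs.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))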