Human Generated Data

Title

Untitled (Branchville, Maryland)

Date

November 1936

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1856

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Hat 100
Coat 100
Adult 98.7
Male 98.7
Man 98.7
Person 98.7
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
People 98.3
Adult 98.1
Male 98.1
Man 98.1
Person 98.1
Adult 97.6
Male 97.6
Man 97.6
Person 97.6
Person 97.5
Handrail 96.6
Adult 96.5
Male 96.5
Man 96.5
Person 96.5
Person 94.9
Cap 94.5
Carpenter 93.7
Adult 90.1
Male 90.1
Man 90.1
Person 90.1
Sword 82.2
Weapon 82.2
Outdoors 79.3
Overcoat 73.3
Face 68.7
Head 68.7
Wood 62.9
Worker 62.7
Construction 56.9
Sun Hat 56.1
Railing 56
Hardhat 55.7
Helmet 55.7
Photography 55.3
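
A minimal, hypothetical sketch of how label/confidence pairs like the ones above could be produced with Amazon Rekognition label detection via boto3 (the S3 bucket and object key are placeholders, not the museum's actual storage):

    # Hedged sketch: image labeling with Amazon Rekognition (boto3).
    # The S3 bucket and key below are hypothetical placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket",
                            "Name": "shahn_branchville_1936.jpg"}},
        MaxLabels=50,
        MinConfidence=55.0,  # roughly the floor of the scores listed above
    )

    # Each label has a name and a confidence in percent,
    # e.g. "Clothing 100", "Person 98.7".
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')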

Clarifai
created on 2018-05-11

people 99.9
group together 99.3
group 99
adult 98
many 97.9
leader 97.7
man 96.6
administration 94.4
military 92.1
war 90.3
wear 88.6
several 88.2
woman 84.8
five 83.6
soldier 83.4
one 81.7
chair 81.1
uniform 80.5
child 80.5
outfit 80.1
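
The Clarifai concepts above are the kind of output returned by Clarifai's general prediction model. A rough, assumption-laden sketch using the legacy clarifai Python client that was current around 2018 (the API key and image URL are placeholders; the exact client version and response shape may differ):

    # Hedged sketch: tagging an image with Clarifai's general model via the
    # legacy `clarifai` 2.x client. Key and URL are hypothetical placeholders.
    from clarifai.rest import ClarifaiApp

    app = ClarifaiApp(api_key="YOUR_API_KEY")
    model = app.public_models.general_model

    result = model.predict_by_url(url="https://example.org/shahn_branchville_1936.jpg")

    # Concepts carry a name and a 0-1 score; scaled to percent they are
    # comparable to "people 99.9" or "group together 99.3" above.
    for concept in result["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')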

Imagga
created on 2023-10-06

steel drum 100
percussion instrument 91.7
musical instrument 74.7
business 39.5
man 34.3
male 31.9
laptop 31.3
businessman 30.9
office 30.8
people 29.6
adult 27.9
person 27.3
corporate 25.8
suit 25.2
work 25.1
computer 24.9
professional 23.9
job 20.3
working 20.3
executive 20.3
men 18.9
smile 17.8
businesswoman 17.3
sitting 17.2
happy 16.9
smiling 16.6
success 16.1
worker 16
building 15.2
manager 14.9
handsome 14.3
businesspeople 14.2
career 13.2
table 13
occupation 12.8
black 12.6
modern 12.6
boss 12.4
silhouette 12.4
portrait 12.3
desk 12.3
women 11.9
sit 11.3
day 11
alone 11
lifestyle 10.8
attractive 10.5
outdoors 10.5
successful 10.1
engineer 9.9
holding 9.9
cheerful 9.8
looking 9.6
standing 9.6
notebook 9.5
tie 9.5
outside 9.4
happiness 9.4
mature 9.3
outdoor 9.2
city 9.1
confident 9.1
one 9
lady 8.9
color 8.9
construction 8.6
face 8.5
senior 8.4
communication 8.4
window 8.2
20s 8.2
indoor 8.2
group 8.1
indoors 7.9
urban 7.9
hands 7.8
glass 7.8
corporation 7.7
busy 7.7
employee 7.6
speaker 7.5
technology 7.4
single 7.4
friendly 7.3
team 7.2
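
The Imagga tags above (including clear misfires such as "steel drum" and the office-themed guesses) resemble results from Imagga's v2 tagging endpoint. A hedged sketch using plain HTTP via requests; the credentials and image URL are placeholders, and the response layout is assumed from Imagga's documented v2 format:

    # Hedged sketch: requesting tags from Imagga's /v2/tags endpoint.
    # API key/secret and image URL are hypothetical placeholders.
    import requests

    API_KEY = "YOUR_KEY"
    API_SECRET = "YOUR_SECRET"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/shahn_branchville_1936.jpg"},
        auth=(API_KEY, API_SECRET),
        timeout=30,
    )
    resp.raise_for_status()

    # Each tag has an English name and a confidence percentage,
    # comparable to "business 39.5" or "man 34.3" above.
    for tag in resp.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')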

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 94.3
man 90.1
group 69.9
people 63.7
old 52.5

Color Analysis

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 99.8%
Calm 99%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 35-43
Gender Male, 99.2%
Calm 89.8%
Surprised 6.9%
Fear 6%
Confused 2.7%
Sad 2.3%
Angry 2.2%
Happy 1.7%
Disgusted 1.5%

AWS Rekognition

Age 18-24
Gender Male, 99.9%
Surprised 92.9%
Calm 36.1%
Fear 5.9%
Sad 2.6%
Angry 0.5%
Confused 0.3%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 10-18
Gender Female, 61.8%
Sad 24.5%
Fear 21.8%
Calm 18.4%
Disgusted 12%
Angry 11.6%
Happy 9%
Surprised 8.5%
Confused 4%
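
The age ranges, gender estimates, and per-emotion scores above are typical of Amazon Rekognition face analysis. A minimal sketch with boto3 (the S3 location is a hypothetical placeholder):

    # Hedged sketch: face analysis with Amazon Rekognition (boto3).
    # The S3 bucket and key are hypothetical placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket",
                            "Name": "shahn_branchville_1936.jpg"}},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

    # Each detected face reports an estimated age range, a gender guess with
    # confidence, and per-emotion confidences, e.g. "Age 28-38", "Calm 99%".
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')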

Microsoft Cognitive Services

Age 46
Gender Male

Feature analysis

Amazon

Adult 98.7%
Male 98.7%
Man 98.7%
Person 98.7%
Sword 82.2%

Categories