Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.663

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Machine 98.7
Wheel 98.7
Car 96.2
Transportation 96.2
Vehicle 96.2
Person 94.9
Person 91.4
Car 88.7
Wheel 84.3
Person 82.7
Face 80.9
Head 80.9
Car 79.1
Alloy Wheel 74.1
Car Wheel 74.1
Spoke 74.1
Tire 74.1
Tractor 70.8
Outdoors 66.3
Architecture 57.7
Building 57.7
Shelter 57.7
City 57.6
Wheel 57.5
Clothing 57.1
Hat 57.1
People 56.3
Truck 56.3
Photography 56.1
Portrait 56.1
Shorts 55.8
Worker 55.6
Pickup Truck 55.4
T-Shirt 55.4
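
The Amazon tags above are label-detection output (a label name plus a confidence score, in percent). A minimal sketch of how such a list could be produced with the AWS Rekognition API, assuming boto3 credentials are configured and using an illustrative file name for the image:

import boto3

# Hedged sketch: "county_fair.jpg" is an illustrative file name, not the
# museum's actual asset; MinConfidence=55 simply mirrors where the list
# above bottoms out.
rekognition = boto3.client("rekognition")

with open("county_fair.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')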

Clarifai
created on 2018-05-11

people 99.7
group together 98.6
vehicle 98.2
adult 98
man 95.4
group 95.4
military 95.4
war 95.3
two 95.2
several 94.3
administration 93.8
leader 91.6
many 91.3
transportation system 91.1
soldier 87.1
five 87
three 86.6
four 86.3
one 86.2
woman 81.6

Imagga
created on 2023-10-05

man 26.2
person 20.5
people 18.4
outdoors 16.9
passenger 16.7
male 16.4
sitting 16.3
hat 15.4
sky 15.3
happy 14.4
building 14.2
house 14.2
adult 13.6
outdoor 13
smile 12.8
day 12.5
couple 12.2
outside 12
love 11.8
architecture 11.7
smiling 11.6
park 11.3
old 11.1
vehicle 11
happiness 11
city 10.8
home 10.4
two 10.2
cowboy hat 9.3
leisure 9.1
laptop 9.1
vacation 9
family 8.9
working 8.8
together 8.8
grass 8.7
travel 8.4
device 8.4
relationship 8.4
car 8.3
technology 8.2
seller 8
business 7.9
chair 7.9
boy 7.8
standing 7.8
portrait 7.8
clothing 7.8
girlfriend 7.7
industry 7.7
world 7.6
wife 7.6
vacations 7.5
historical 7.5
spectator 7.5
mature 7.4
emotion 7.4
lifestyle 7.2
computer 7.2
equipment 7.2
work 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

tree 100
outdoor 99.9
man 92.6
person 85.1

Color Analysis

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 99.9%
Calm 97.6%
Surprised 7%
Fear 5.9%
Sad 2.2%
Angry 0.5%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%

AWS Rekognition

Age 18-24
Gender Male, 95.7%
Calm 99.1%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.4%
Confused 0.1%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 35-43
Gender Male, 100%
Calm 98%
Surprised 6.3%
Fear 5.9%
Sad 2.5%
Happy 0.4%
Confused 0.2%
Disgusted 0.2%
Angry 0.1%

Microsoft Cognitive Services

Age 48
Gender Male

Microsoft Cognitive Services

Age 42
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
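
The age ranges, gender calls, and emotion scores above follow the shape of Rekognition's face-detection response (Google Vision reports the same kinds of attributes as likelihood categories rather than percentages). A hedged sketch of the Rekognition side, under the same assumptions as the label example:

import boto3

# Sketch only: prints one block per detected face, in the format used above.
rekognition = boto3.client("rekognition")

with open("county_fair.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')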

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%
Wheel 98.7%
Car 96.2%

Text analysis

Amazon

25
WELLS
ОАН WELLS
ОАН
ROLLSIDA
1
LUSTI
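
The strings above are raw OCR detections, reproduced as returned (including the partially garbled readings such as "ОАН"). A minimal sketch of the corresponding Rekognition call, under the same assumptions as the earlier examples:

import boto3

# Sketch only: prints each detected line of text as-is.
rekognition = boto3.client("rekognition")

with open("county_fair.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # words are also returned; keep lines only
        print(detection["DetectedText"])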