Human Generated Data

Title

Untitled (Middleboro, Kentucky)

Date

October 1935, printed later

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3358

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 99.9
Person 99.9
Person 99.7
Person 99.7
Person 99.7
Person 99.5
Person 99
Military 98.8
Military Uniform 98.3
Person 98.2
Officer 93.7
Person 89.6
Army 88.8
Armored 88.8
Person 85.2
Person 82.8
Soldier 82.3
People 80.6
Person 79.4
Clothing 78.5
Apparel 78.5
Person 70.6
Crowd 63.4
Troop 57.1

Imagga
created on 2021-12-15

uniform 29.8
stretcher 29.7
military uniform 26.4
pedestrian 26.2
litter 23.7
man 23.5
people 22.9
conveyance 20.2
travel 19.7
clothing 19.6
city 17.4
military 16.4
war 16.4
snow 16.3
person 16.2
winter 16.2
male 14.9
adult 14.2
sport 14.2
urban 13.1
tourist 12.5
attendant 12.4
active 12
covering 12
men 12
outdoors 11.9
speed 11.9
transport 11.9
soldier 11.7
walking 11.4
cold 11.2
street 11
consumer goods 11
mountain 10.9
transportation 10.8
old 10.4
gun 10.3
women 10.3
weapon 10.1
leisure 10
outdoor 9.9
army 9.7
sky 9.6
protection 9.1
danger 9.1
activity 9
business 8.5
action 8.3
fun 8.2
to 8
engineer 8
lifestyle 7.9
crowd 7.7
extreme 7.7
walk 7.6
horse 7.6
journey 7.5
human 7.5
tourism 7.4
rifle 7.4
vacation 7.4
station 7.3
industrial 7.3
black 7.2
equipment 7.1
vehicle 7.1
day 7.1
season 7

Google
created on 2021-12-15

(no tags returned)
Microsoft
created on 2021-12-15

outdoor 99.3
person 98.9
clothing 97.8
man 95.9
text 94.7
people 92.5
group 91.8
bicycle 59.5
black and white 57.3
footwear 57
posing 40.8
crowd 0.6

Face analysis

AWS Rekognition

Age 34-50
Gender Male, 92.6%
Calm 96%
Surprised 1.5%
Sad 0.9%
Confused 0.6%
Angry 0.5%
Happy 0.4%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 23-35
Gender Male, 96.1%
Happy 65.4%
Calm 23.8%
Surprised 6.4%
Fear 1.2%
Angry 1.1%
Sad 0.9%
Disgusted 0.8%
Confused 0.5%

AWS Rekognition

Age 24-38
Gender Male, 97.9%
Calm 95.2%
Surprised 2.6%
Happy 1.1%
Angry 0.3%
Sad 0.3%
Confused 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 32-48
Gender Male, 98.3%
Calm 82.6%
Angry 6.4%
Disgusted 3.3%
Sad 2.8%
Confused 2.1%
Fear 1.1%
Happy 0.9%
Surprised 0.8%

AWS Rekognition

Age 47-65
Gender Male, 98.8%
Sad 87.8%
Calm 10.3%
Angry 1%
Confused 0.5%
Happy 0.2%
Fear 0.1%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 36-54
Gender Male, 96.4%
Calm 48.3%
Sad 46.8%
Happy 2%
Angry 1.1%
Fear 1%
Confused 0.5%
Surprised 0.2%
Disgusted 0.1%

AWS Rekognition

Age 44-62
Gender Female, 73.2%
Calm 97%
Sad 1.5%
Confused 1.1%
Happy 0.2%
Surprised 0.1%
Angry 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 35-51
Gender Male, 97.9%
Calm 93.8%
Sad 4.3%
Angry 1%
Fear 0.3%
Happy 0.2%
Confused 0.2%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 24-38
Gender Male, 88%
Calm 51.8%
Sad 41.2%
Surprised 2.2%
Fear 1.9%
Happy 1.4%
Angry 1%
Confused 0.5%
Disgusted 0.1%

AWS Rekognition

Age 36-54
Gender Male, 98.8%
Calm 95%
Angry 2.3%
Happy 0.9%
Sad 0.8%
Confused 0.7%
Surprised 0.4%
Fear 0.1%
Disgusted 0.1%

Microsoft Cognitive Services

Age 49
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.9%

Captions

Microsoft

a group of people standing in front of a building 98.1%
a group of people standing in front of a crowd posing for the camera 96.6%
a group of people posing for a photo 96.5%

Text analysis

Amazon

FRU
CITY
ALL
CLASSIC CITY
CLASSIC
DANCE
YORK
NEW
SECOND
DANCE HA
ON SECOND
A
HA
ON
MANSFIELD
HARDWARF
VEGE
:-
PARONARE
DIXIE
2074
2074 as
M
as

Google

YORK CLASSIC CITY DANCE HA ON SECON DRE HARDWARF FRU VEGE ALL I
YORK
VEGE
HA
ON
DRE
ALL
I
DANCE
CLASSIC
CITY
SECON
HARDWARF
FRU