Human Generated Data

Title

Untitled (public auction, A.H. Buchwalter farm, near Hilliards, Ohio)

Date

August 6, 1938, printed later

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3289

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Clothing 100
Sun Hat 100
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Adult 97.4
Male 97.4
Man 97.4
Person 97.4
Person 96.7
Adult 96.6
Male 96.6
Man 96.6
Person 96.6
Adult 95.8
Male 95.8
Man 95.8
Person 95.8
Adult 95.7
Male 95.7
Man 95.7
Person 95.7
Person 94.8
Person 94.5
Person 93.7
Face 83.6
Head 83.6
Person 83.5
Outdoors 82.3
Person 76.1
Hat 73.8
Hat 71.1
Person 70.9
Nature 66.9
Machine 66.9
Wheel 66.9
Hat 59.6
Person 58.9
Hat 58.6
Transportation 56.9
Vehicle 56.9
Countryside 56
Farm 56
Harvest 56
Rural 56
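
These Amazon tags follow the shape of AWS Rekognition label detection output: a label name paired with a confidence score, with labels such as Person apparently repeated once per detected instance. A minimal sketch of how tags like these might be produced, assuming boto3 credentials are configured and "photo.jpg" is a placeholder for the digitized image:

# Minimal sketch: image labels via AWS Rekognition DetectLabels.
# "photo.jpg" is a placeholder path; region and credentials come from the boto3 environment.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,
    )

# Print each label with its confidence, similar to the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")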

Clarifai
created on 2018-05-10

people 99.9
group 98
adult 97.8
lid 97.4
transportation system 97.2
group together 96.8
vehicle 96.6
many 95.8
man 95.8
industry 91.7
wear 89
veil 87.4
woman 87.3
watercraft 85.4
military 85
grinder 83.5
commerce 81.2
merchant 79.9
dig 79.5
two 79.5

Imagga
created on 2023-10-07

farmer 72.8
person 34.4
man 26.2
machine 24.5
work 21.3
farm 20.5
working 20.3
male 19.8
field 19.2
hat 18.8
thresher 18.3
people 17.3
industry 16.2
worker 16.1
cowboy 16
farm machine 15.4
grass 15
agriculture 14.9
summer 14.8
construction 14.5
job 14.1
rural 14.1
old 13.9
plow 13.5
outdoors 13.4
tool 13.2
industrial 12.7
men 12
hay 11.7
equipment 11.5
building 11.5
country 11.4
farming 11.4
boy 11.3
sugar 11.3
device 11
tractor 11
active 10.9
vehicle 10.9
outdoor 10.7
crop 10.3
site 10.3
plant 10.3
sky 10.2
occupation 10.1
countryside 10
factory 10
transportation 9.9
machinery 9.7
wheel 9.4
senior 9.4
builder 9.3
transport 9.1
adult 9
environment 9
together 8.8
labor 8.8
structure 8.7
sitting 8.6
two 8.5
house 8.4
leisure 8.3
land 8.3
landscape 8.2
happy 8.1
home 8
steel 7.9
yellow 7.9
portrait 7.8
straw 7.7
helmet 7.7
dirt 7.6
horse 7.6
harvest 7.5
wood 7.5
shovel 7.5
shirt 7.5
engineer 7.4
earth 7.3
uniform 7.3
protection 7.3
activity 7.2
handsome 7.1
animal 7.1
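
The Imagga entries above likewise pair an English tag with a confidence score, which matches the response shape of Imagga's v2 tagging endpoint. A rough sketch using the requests library; the API key, secret, and image URL are placeholders, and the response field names follow the documented v2 format:

# Rough sketch: tagging an image with the Imagga v2 API.
# API_KEY, API_SECRET, and IMAGE_URL are placeholders.
import requests

API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/photo.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each result entry carries a tag and a confidence score.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")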

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 98.7
outdoor 91.9
group 83.1
people 60.3
old 58.7
working 55.5
several 10.4

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 100%
Calm 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0%
Confused 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 36-44
Gender Male, 86.5%
Angry 96%
Surprised 6.7%
Fear 5.9%
Sad 2.2%
Calm 2%
Disgusted 0.5%
Confused 0.2%
Happy 0.1%

AWS Rekognition

Age 34-42
Gender Male, 100%
Calm 75.4%
Surprised 10.7%
Confused 7.8%
Fear 6.2%
Angry 3.5%
Sad 3.2%
Happy 1.3%
Disgusted 1.3%

AWS Rekognition

Age 35-43
Gender Female, 95%
Calm 89.7%
Surprised 8.7%
Fear 6%
Disgusted 2.9%
Sad 2.2%
Angry 1.2%
Happy 0.9%
Confused 0.9%

AWS Rekognition

Age 43-51
Gender Male, 100%
Calm 99.1%
Surprised 6.5%
Fear 5.9%
Sad 2.2%
Angry 0.2%
Disgusted 0.1%
Confused 0.1%
Happy 0%

AWS Rekognition

Age 38-46
Gender Male, 96.6%
Calm 99.1%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Angry 0.1%
Confused 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 53-61
Gender Male, 99.9%
Calm 100%
Surprised 6.3%
Fear 5.9%
Sad 2.1%
Confused 0%
Angry 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 35-43
Gender Male, 77.8%
Calm 81%
Surprised 21.7%
Fear 6%
Sad 2.3%
Disgusted 0.5%
Angry 0.5%
Happy 0.4%
Confused 0.3%

AWS Rekognition

Age 56-64
Gender Male, 99.9%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0%
Angry 0%
Disgusted 0%
Happy 0%
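
Each AWS Rekognition block above reports an estimated age range, a gender guess with confidence, and a set of emotion scores, which corresponds to Rekognition face detection output when all facial attributes are requested. A minimal sketch, again assuming boto3 is configured and "photo.jpg" stands in for the image:

# Minimal sketch: per-face attributes via AWS Rekognition DetectFaces.
# Attributes=["ALL"] requests age range, gender, and emotion estimates.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

# Print one block per detected face, in the style of the listings above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")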

Microsoft Cognitive Services

Age 86
Gender Male

Microsoft Cognitive Services

Age 44
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
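
Google Vision reports face attributes as likelihood categories (Very unlikely through Very likely) rather than percentages. A short sketch with the google-cloud-vision client, assuming default credentials and a placeholder "photo.jpg":

# Short sketch: face likelihoods via the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum names (e.g. VERY_UNLIKELY) correspond to the labels above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)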

Feature analysis

Amazon

Adult 98.6%
Male 98.6%
Man 98.6%
Person 98.6%
Hat 73.8%
Wheel 66.9%

Categories

Imagga

paintings art 97%