Human Generated Data

Title

Untitled (man transferring hay from truck to trough while cattle surround him)

Date

1958

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6468

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Human 99.4
Person 99.4
Nature 89.7
Outdoors 88.7
Transportation 72.9
Vehicle 72.9
Snow 70.2
Ice 68.6
People 61.5
Alloy Wheel 58.1
Spoke 58.1
Machine 58.1
Wheel 58.1
Winter 57.7
Weather 55.8

Clarifai
created on 2019-03-22

people 99.7
adult 97.8
vehicle 97.4
man 96.9
group 96.2
transportation system 95.1
group together 92.8
woman 90.2
many 88.8
military 87.7
war 87.1
wear 84.1
one 82
recreation 81.9
monochrome 81.5
aircraft 80.1
veil 76.9
actor 76.9
music 76.7
leader 76.3

Imagga
created on 2019-03-22

building 33.1
structure 24.5
travel 21.8
greenhouse 21.6
architecture 21.4
negative 18.5
old 16.7
water 16
sky 15.9
city 15.8
tourism 15.7
film 15
fence 14.5
landscape 13.4
scenery 12.6
barrier 12.1
industrial 11.8
sea 11.7
picket fence 11.7
truck 11.7
wall 11.2
stone 11.1
house 10.9
ocean 10.8
black 10.2
town 10.2
photographic paper 10.1
scenic 9.6
scene 9.5
winter 9.4
island 9.2
coast 9
river 8.9
roof 8.7
ancient 8.6
industry 8.5
obstruction 8.5
boat 8.4
street 8.3
vacation 8.2
history 8
garbage truck 8
light 8
glass 7.8
houses 7.7
construction 7.7
device 7.7
grunge 7.7
clouds 7.6
motor vehicle 7.5
traditional 7.5
holiday 7.2
tower 7.2
vehicle 7
wheeled vehicle 7

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Happy 45.4%
Calm 53.5%
Confused 45.1%
Disgusted 45.1%
Sad 45.2%
Surprised 45.5%
Angry 45.3%

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft
created on 2019-03-22

an old photo of a person 54.7%
a black and white photo of a person 43.4%

Text analysis

Amazon

YT33A2
MJ13
Ln