Human Generated Data

Title

Untitled (two men with cotton bushels)

Date

c. 1950

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2705
Machine Generated Data

Tags

Amazon
created on 2022-01-16

Human 99.8
Person 99.8
Person 99.7
Footwear 91.4
Clothing 91.4
Apparel 91.4
Shoe 91.4
Wheel 88.6
Machine 88.6
Transportation 85.1
Vehicle 85.1
Spoke 84.4
Shoe 76.3
Wheel 75.6
Tire 75.5
Skateboard 72.8
Sport 72.8
Sports 72.8
Alloy Wheel 72.3
Wagon 63.4
People 61
Shoe 59.7
Car Wheel 57.6
Building 55.6
Person 42.2

Imagga
created on 2022-01-16

man 28.9
shopping cart 23.6
handcart 20.8
wheeled vehicle 19.8
male 19.1
musical instrument 18
person 16.3
people 16.2
transportation 14.3
old 13.9
work 13.9
vehicle 13.8
adult 13.8
accordion 13.1
men 12.9
car 12.7
container 12.6
city 12.5
military 11.6
outdoor 11.5
travel 11.3
building 11.1
industry 11.1
industrial 10.9
newspaper 10.9
keyboard instrument 10.8
wind instrument 10.1
street 10.1
protection 10
machine 10
conveyance 9.6
sky 9.6
uniform 9.4
occupation 9.2
transport 9.1
business 9.1
road 9
human 9
product 8.8
urban 8.7
scene 8.6
automobile 8.6
architecture 8.6
two 8.5
outdoors 8.2
job 8
working 8
grass 7.9
soldier 7.8
seller 7.7
clothing 7.7
war 7.7
auto 7.7
power 7.6
landscape 7.4
sport 7.4
historic 7.3
device 7.3
danger 7.3
worker 7.2
metal 7.2
structure 7.2
equipment 7.1
summer 7.1

Google
created on 2022-01-16

Microsoft
created on 2022-01-16

black and white 95.4
text 94.1
person 90.3
outdoor 87.6
clothing 87.1
monochrome 80
street 75
man 64.4

Face analysis

AWS Rekognition

Age 31-41
Gender Female, 64.6%
Calm 64.7%
Happy 20.9%
Sad 6.9%
Surprised 2.4%
Disgusted 2.1%
Angry 1.5%
Confused 0.9%
Fear 0.6%

AWS Rekognition

Age 24-34
Gender Female, 99.8%
Confused 54.3%
Sad 19.7%
Calm 14.3%
Surprised 4%
Fear 2.5%
Disgusted 2%
Happy 1.6%
Angry 1.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Shoe 91.4%
Wheel 88.6%
Skateboard 72.8%

Captions

Microsoft

a group of people standing next to a window 57.8%
a group of people standing in front of a window 54.5%
a person standing next to a window 45%

Text analysis

Amazon

Coca-Cola
HOMA
MOR

Google

M
XAGON
I
--
M」Iヨ--YT33A2--XAGON
--YT33A2