Human Generated Data

Title

Untitled (large machine in factory)

Date

c. 1945

People

Artist: Robert Burian, American, active 1940s–1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19131

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Train 83.3%
Transportation 83.3%
Vehicle 83.3%
Machine 74.2%
Building 63.5%
Armory 58%
Weapon 58%
Weaponry 58%
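
These Amazon tags can be reproduced with the label-detection call in AWS Rekognition, the same service named in the Face and Feature analysis sections below. A minimal sketch in Python with boto3, assuming configured AWS credentials; the file name and region are placeholders:

import boto3

# Placeholder file name for a scan of this photograph.
IMAGE_PATH = "untitled_machine_in_factory.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,  # drop labels below 50% confidence
    )

# Print each label in the same "name confidence%" shape as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")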

Clarifai
created on 2023-10-22

grinder 99.8%
production 98.8%
industry 97.9%
no person 97.2%
people 94.5%
indoors 94.3%
container 92.7%
machine 92.3%
conveyer belt 91.1%
room 89.1%
group 88.9%
many 87%
equipment 84.3%
rack 81.6%
business 80.4%
automation 80.1%
vehicle 79.7%
food 79.7%
machinery 78.6%
adult 78.6%
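
Clarifai exposes its general-recognition model over a REST endpoint. A minimal sketch, assuming the current "general-image-recognition" model ID; the API key and image URL are placeholders, and the 0-1 concept values are scaled to percentages to match the list above:

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder credential
IMAGE_URL = "https://example.org/untitled_machine_in_factory.jpg"

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

# Concepts carry 0-1 confidences; print as percentages.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}%")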

Imagga
created on 2022-03-05

abacus 56.1%
calculator 42.9%
sequencer 29.3%
equipment 28.1%
device 27.2%
apparatus 26.4%
city 23.3%
architecture 18.7%
travel 18.3%
urban 16.6%
building 16.5%
freight car 15.4%
car 14.3%
supermarket 14%
wheeled vehicle 13.9%
business 12.7%
tourism 12.4%
mercantile establishment 11.7%
transportation 11.6%
technology 11.1%
grocery store 10.6%
old 10.4%
vehicle 10.3%
boat 10.2%
shop 10%
landmark 9.9%
vacation 9.8%
industry 9.4%
sea 9.4%
destination 9.3%
water 9.3%
historic 9.2%
tourist 9.1%
shoe shop 9%
tower 8.9%
ship 8.9%
vessel 8.8%
buildings 8.5%
finance 8.4%
machine 8.4%
container 8.3%
marketplace 7.9%
work 7.8%
power 7.6%
famous 7.4%
wagon 7.4%
light 7.3%
transport 7.3%
bank 7.2%
financial 7.1%
night 7.1%
information 7.1%
sky 7%
glass 7%
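
Imagga's tagging runs over a REST endpoint with HTTP Basic authentication. A minimal sketch against the v2 /tags endpoint; the key, secret, and image URL are placeholders, and confidences already arrive on a 0-100 scale:

import requests

API_KEY = "YOUR_IMAGGA_KEY"  # placeholder credentials
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/untitled_machine_in_factory.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each entry nests the tag text under a language key.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}%")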

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

indoor 91.7%
text 91.3%
black and white 89%
black 79.2%
white 68.2%
monochrome 61.3%
bottle 52.9%
old 43.7%
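
These Microsoft tags match the shape of Azure Computer Vision's tag operation. A minimal sketch, assuming a v3.2 deployment; the endpoint, key, and image URL are placeholders:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"  # placeholder
IMAGE_URL = "https://example.org/untitled_machine_in_factory.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

# Confidences are 0-1; print as percentages to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}%")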

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Male, 85.6%
Calm 64.7%
Fear 11.5%
Sad 9.2%
Angry 4.7%
Surprised 3.6%
Confused 3.2%
Disgusted 1.7%
Happy 1.4%

AWS Rekognition

Age 14-22
Gender Male, 99%
Calm 62.2%
Sad 12.9%
Angry 12.8%
Confused 3.2%
Fear 3%
Happy 2.8%
Disgusted 1.9%
Surprised 1.3%

AWS Rekognition

Age 20-28
Gender Female, 65.8%
Calm 40.8%
Fear 29.2%
Sad 16.1%
Surprised 5%
Angry 4.9%
Disgusted 2%
Confused 1.1%
Happy 0.9%

AWS Rekognition

Age 13-21
Gender Male, 92.4%
Fear 53.1%
Calm 29.9%
Surprised 7.8%
Happy 2.9%
Disgusted 1.9%
Sad 1.8%
Angry 1.5%
Confused 1%

AWS Rekognition

Age 22-30
Gender Male, 94.4%
Calm 75.6%
Fear 20.3%
Sad 1.2%
Confused 0.8%
Surprised 0.7%
Disgusted 0.6%
Happy 0.5%
Angry 0.4%

AWS Rekognition

Age 13-21
Gender Male, 98.1%
Calm 66.6%
Sad 10.3%
Confused 8.9%
Fear 5.4%
Happy 2.9%
Disgusted 2.7%
Angry 2.5%
Surprised 0.7%
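
All six face records above follow the shape of Rekognition's face-detection response, which returns an estimated age range, a gender guess, and per-emotion confidences for every detected face. A minimal sketch with boto3, assuming configured credentials; the file name is a placeholder:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_machine_in_factory.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Sort emotions highest-first to match the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")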

Feature analysis

Amazon

Train
Train 83.3%

Captions

Microsoft
created on 2022-03-05

an old photo of a store 76.8%
old photo of a store 72.3%
a portrait of a store 72.2%
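
The captions correspond to Azure Computer Vision's describe operation, which can return several candidate sentences for one image. A minimal sketch with the same placeholder endpoint and key as the tag example above, again assuming a v3.2 deployment:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"  # placeholder
IMAGE_URL = "https://example.org/untitled_machine_in_factory.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    params={"maxCandidates": 3},  # ask for up to three candidate captions
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")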

Text analysis

Amazon

IIA
KODAK-
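
The transcribed fragments above ("IIA", "KODAK-") are the kind of output Rekognition's text-detection call produces for printed text such as film-edge markings. A minimal sketch with boto3; the file name is again a placeholder:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_machine_in_factory.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE entries aggregate the individual WORD detections.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f"{detection['DetectedText']} ({detection['Confidence']:.1f}%)")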