Human Generated Data

Title

Untitled (two men in front of large machine)

Date

1951

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16733

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.8
Human 99.8
Person 99.3
Clock Tower 89.9
Architecture 89.9
Tower 89.9
Building 89.9
Clock Tower 88.8
Clothing 74.4
Apparel 74.4
People 67.4
Shorts 64.1
Face 56.3
Safe 56.2
Crypt 56.1

Clarifai
created on 2023-10-29

people 99.9
adult 97.7
group together 97.6
two 96
man 95.2
group 95.1
street 91.1
one 90.6
three 90.1
room 89.7
monochrome 89.7
vehicle 89.3
woman 88.3
wear 87.4
child 87
door 85.6
war 85
transportation system 83.6
aircraft 82.9
four 82.7

Imagga
created on 2022-02-26

safe 100
strongbox 100
box 76.6
container 52.6
locker 29.2
fastener 22.6
city 19.1
old 18.8
building 17.5
restraint 17
architecture 15.6
device 13.7
business 13.4
industry 12.8
house 11.7
urban 11.3
ancient 11.2
vintage 10.7
travel 10.6
storage 10.5
antique 10.4
wall 10.3
grunge 10.2
paper 10.2
industrial 10
equipment 9.8
interior 9.7
light 9.4
retro 9
metal 8.8
room 8.7
people 8.4
office 8.2
warehouse 8.2
steel 7.9
working 7.9
work 7.8
scene 7.8
empty 7.7
factory 7.7
old fashioned 7.6
power 7.5
man 7.4
street 7.4
aged 7.2
station 7.1
modern 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98.3
train 69.2
black and white 61.4

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 31-41
Gender Male, 95%
Sad 58.8%
Calm 13.8%
Happy 12%
Confused 8.1%
Disgusted 2.7%
Fear 1.8%
Angry 1.6%
Surprised 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Clock Tower
Person 99.8%
Person 99.3%
Clock Tower 89.9%
Clock Tower 88.8%

Categories

Captions

Microsoft
created on 2022-02-26

an old photo of a person 80.2%
old photo of a person 80.1%
a old photo of a person 75.4%

Text analysis

Amazon

BROS
AS
YТЭ-А
инол
it u
it u инол all SAMUY
SAMUY
all

Google

YT3RA2-A BROS)
YT3RA2-A
BROS)