Human Generated Data

Title

Untitled (scientist adjusting large rods inside reactor)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15616.1

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 94.5
Human 94.5
Clothing 63.8
Apparel 63.8
Finger 60.8
Face 57.1

Clarifai
created on 2023-10-29

people 99.4
monochrome 98.4
music 98.0
man 98.0
musician 95.8
art 92.7
instrument 92.4
adult 91.1
administration 90.2
indoors 89.8
one 87.3
technology 86.3
jazz 85.4
leader 85.4
group 85
two 82.7
piano 81.4
chair 80.8
actor 80.3
step 80.1

Imagga
created on 2022-02-05

interior 27.4
architecture 25.5
building 22.2
house 20.9
home 19.1
window 19
wall 17.9
modern 17.5
glass 16.5
light 15.3
device 14.9
room 14.4
equipment 14.3
old 13.9
design 13.5
decor 13.2
furniture 12.8
indoors 12.3
luxury 12
style 11.8
city 11.6
sink 11.6
art 11.5
apartment 11.5
contemporary 11.3
metal 11.2
balcony 10.9
structure 10.3
inside 10.1
lamp 9.9
stove 9.4
antique 9.2
chandelier 9.2
table 9.1
kitchen 8.8
decoration 8.7
residential 8.6
travel 8.4
yellow 7.9
flowers 7.8
ancient 7.8
chair 7.7
stone 7.6
tourism 7.4
church 7.4
historic 7.3
new 7.3
clock 7.3
steel 7.2
basin 7.2
detail 7.2
landmark 7.2
transportation 7.2
religion 7.2
fixture 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

wall 95.9
black and white 94.4
indoor 89.6
text 73.3
white 61.3

Color Analysis

Face analysis

AWS Rekognition

Age 41-49
Gender Female, 65.9%
Sad 98.5%
Calm 0.8%
Happy 0.3%
Fear 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Surprised 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 94.5%

Categories

Imagga

interior objects 98.6%

Text analysis

Amazon

2

Google

2.
2.