Human Generated Data

Title

Untitled (two boys with model train set)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17751

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.2
Human 99.2
Room 92.7
Indoors 92.7
Person 87.9
Person 77.6
Clothing 76.1
Apparel 76.1
Person 70.5
Furniture 68.6
Dressing Room 59.4
Living Room 59.2
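
The Amazon labels above (a label name paired with a confidence score) have the shape of AWS Rekognition DetectLabels output. A minimal sketch of how such tags could be produced with boto3 follows; the file name "photo.jpg", the confidence threshold, and the assumption of configured AWS credentials are illustrative only and not part of this record.

    import boto3

    # Sketch: label detection with AWS Rekognition (boto3).
    # "photo.jpg" is a placeholder file name, not part of this record.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # assumed threshold; the record does not state one
        )

    for label in response["Labels"]:
        # Prints e.g. "Person 99.2", matching the label/confidence pairs above.
        print(label["Name"], round(label["Confidence"], 1))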

Clarifai
created on 2023-10-29

people 99.9
adult 99.2
monochrome 98.1
group 97.7
two 97.5
wear 97.3
man 96.1
one 94.6
vehicle 94.3
industry 93.3
administration 93.2
three 92.7
woman 92.5
furniture 91.8
group together 91.3
several 90.6
leader 87.7
illustration 86.5
many 86.3
outfit 85.7

Imagga
created on 2022-02-26

musical instrument 20.3
people 20.1
sax 19.8
old 17.4
male 17
person 16.6
man 16.4
architecture 16.4
building 15.9
men 14.6
construction 14.5
wind instrument 13.9
negative 13.9
business 13.3
design 12.9
work 12.8
film 12.3
city 11.6
silhouette 11.6
accordion 11.5
black 11.4
equipment 11.1
retro 10.6
keyboard instrument 10.5
portrait 10.3
decoration 10.2
grunge 10.2
sky 10.2
team 9.8
drawing 9.8
ancient 9.5
symbol 9.4
house 9.2
travel 9.1
art 9.1
vintage 9.1
job 8.8
businessman 8.8
home 8.8
lab 8.7
couple 8.7
sculpture 8.6
scene 8.6
adult 8.6
plaything 8.4
structure 8.3
worker 8.2
technology 8.2
statue 8.2
professional 8.1
history 8
water 8
interior 7.9
urban 7.9
laboratory 7.7
modern 7.7
youth 7.7
clothing 7.6
biology 7.6
cityscape 7.6
antique 7.5
room 7.5
site 7.5
human 7.5
floor 7.4
tourism 7.4
occupation 7.3
new 7.3
photographic paper 7.3
smiling 7.2
medical 7.1

Google
created on 2022-02-26

Black 89.7
Table 88.2
Black-and-white 85.2
Chair 84
Style 83.9
Monochrome photography 73.5
Monochrome 73.3
Curtain 70.9
Pattern 70.7
Rectangle 69.1
Event 69
Visual arts 65.1
Vintage clothing 64.8
Cooking 63.9
Art 62.8
Stock photography 62.6
Tablecloth 61.2
Room 60.9
Font 60.3
Sitting 60.1
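
The Google tags above resemble Google Cloud Vision label annotations. A hedged sketch using the google-cloud-vision Python client; the client library choice, the placeholder file name, and the assumption of configured application credentials are not stated in the record.

    from google.cloud import vision

    # Sketch: label detection with the Google Cloud Vision API.
    # "photo.jpg" is a placeholder; application credentials are assumed.
    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    for annotation in response.label_annotations:
        # Scores are 0-1 floats; the record lists them as percentages.
        print(annotation.description, round(annotation.score * 100, 1))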

Microsoft
created on 2022-02-26

text 99.5
black and white 74.5
person 66.9
clothing 66.8

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 53.3%
Happy 45.6%
Calm 30.5%
Sad 13.4%
Surprised 4.8%
Fear 3%
Disgusted 1.2%
Confused 0.8%
Angry 0.7%

AWS Rekognition

Age 36-44
Gender Male, 97.8%
Calm 56.3%
Sad 39%
Disgusted 1.5%
Confused 1.1%
Happy 0.8%
Angry 0.7%
Fear 0.3%
Surprised 0.2%
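
The two AWS Rekognition face estimates above (age range, gender, and emotion percentages) correspond to DetectFaces output with all attributes requested. A minimal sketch under the same assumptions as the earlier boto3 snippet:

    import boto3

    # Sketch: face analysis with AWS Rekognition DetectFaces.
    # "photo.jpg" is a placeholder; AWS credentials are assumed to be configured.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # required to get age, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")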

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
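
The Google Vision likelihood buckets above match the face annotation fields returned by the Cloud Vision face detection endpoint. A sketch under the same assumptions as the earlier Google snippet:

    from google.cloud import vision

    # Sketch: face detection with the Google Cloud Vision API.
    # "photo.jpg" is a placeholder; application credentials are assumed.
    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood enums print as e.g. VERY_UNLIKELY, matching the buckets above.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)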

Feature analysis

Amazon

Person
Person 99.2%
Person 87.9%
Person 77.6%
Person 70.5%

Captions

Microsoft
created on 2022-02-26

an old photo of a man 74%
old photo of a man 71.4%
an old photo of a man 63.6%

Text analysis

Amazon

19.
KODAK
-
TAPA2
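
The detected strings above, apparently fragments of the Kodak film-edge marking, are the kind of output returned by AWS Rekognition DetectText. A minimal sketch under the same assumptions as the earlier boto3 snippets:

    import boto3

    # Sketch: text detection with AWS Rekognition DetectText.
    # "photo.jpg" is a placeholder file name, not part of this record.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            # Edge markings on the negative come back as short lines like "KODAK".
            print(detection["DetectedText"])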

Google

19. VAA KODVK
19.
VAA
KODVK