Human Generated Data

Title

Untitled (three people outside Suffolk Harness Co. shop)

Date

1925

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1872

Machine Generated Data

Tags (confidence in percent)

Amazon
created on 2021-12-14

Person 99.6
Human 99.6
Person 97.7
Clothing 94.6
Apparel 94.6
Person 90.9
Shop 89.8
Text 75.7
Poster 71.9
Advertisement 71.9
Window Display 70.2
Brick 59.8
Worker 59.5
Coat 58.6
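
The Amazon labels above are the output of an automated image-labeling service. Below is a minimal sketch of how such tags could be requested with AWS Rekognition's detect_labels via boto3; the file name and thresholds are assumptions for illustration, not part of this record.

import boto3

client = boto3.client("rekognition")

# Hypothetical local scan of the photograph.
with open("suffolk_harness_co.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap the number of returned labels
    MinConfidence=50.0,  # drop low-confidence guesses
)

# Print each label with its confidence, mirroring the "Person 99.6" format.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")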

Clarifai
created on 2023-10-25

monochrome 99.8
people 99.7
adult 98.5
man 96.5
barber 93.7
street 91.8
stock 91.5
sit 89.1
vertical 87.5
administration 85
indoors 84.3
newspaper 83.8
woman 83.7
group 83.4
furniture 82.4
chair 81
text 80.9
coverage 80.5
three 80.3
one 78.7
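
Concepts like the Clarifai list above can be requested over Clarifai's public v2 REST API. The sketch below is a hedged illustration; the model ID, API key, and file name are assumptions, not taken from this record.

import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # hypothetical credential
MODEL_ID = "general-image-recognition"  # assumed public general model

# Hypothetical local scan, base64-encoded as the API expects.
with open("suffolk_harness_co.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)

# Concepts come back with a 0-1 "value"; scale to percent to match the list.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")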

Imagga
created on 2021-12-14

barbershop 100
shop 100
mercantile establishment 84.4
place of business 56.2
newspaper 29.8
establishment 28.1
building 26.5
wall 24
architecture 23.4
window 23.1
product 22.1
old 20.9
door 17.3
creation 17.1
house 16.7
vintage 16.5
antique 12.1
glass 11.7
city 11.6
structure 11.3
home 11.2
art 11.1
street 11
aged 10.9
dirty 10.8
room 10.7
retro 10.7
travel 10.6
ancient 10.4
brick 10.4
empty 10.3
construction 10.3
historic 10.1
tourism 9.9
interior 9.7
business 9.1
texture 9
detail 8.8
office 8.7
grunge 8.5
buildings 8.5
design 8.4
wood 8.3
exterior 8.3
decoration 8.1
history 8.1
urban 7.9
closed 7.7
historical 7.5
sign 7.5
town 7.4
light 7.4
people 7.3
road 7.2
black 7.2
open 7.2
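
Imagga exposes a comparable tagging endpoint in its v2 REST API. A sketch under assumed credentials and an assumed image URL:

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # hypothetical credential
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # hypothetical credential
IMAGE_URL = "https://example.org/suffolk_harness_co.jpg"  # hypothetical

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key and secret
)

# Each entry carries a 0-100 confidence and a language-keyed tag name.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")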

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 100
newspaper 97.7
person 89.4
clothing 87.5
black and white 82.1
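
The Microsoft tags correspond to Azure Computer Vision's tagging operation. The sketch below assumes the public v3.2 REST endpoint, with a hypothetical resource endpoint, key, and file name.

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # hypothetical
KEY = "YOUR_AZURE_KEY"  # hypothetical credential

# Hypothetical local scan of the photograph.
with open("suffolk_harness_co.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)

# Confidence is 0-1; scale to percent to match the "text 100" style above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")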

Face analysis

AWS Rekognition

Age 50-68
Gender Female, 56.8%
Calm 69.6%
Happy 27.3%
Surprised 1.2%
Sad 0.5%
Angry 0.5%
Disgusted 0.4%
Confused 0.2%
Fear 0.2%

AWS Rekognition

Age 27-43
Gender Male, 61.8%
Calm 95.6%
Happy 3.9%
Sad 0.2%
Surprised 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 51-69
Gender Male, 91.2%
Calm 92%
Sad 2.3%
Happy 2.3%
Confused 1.5%
Surprised 1.1%
Angry 0.3%
Disgusted 0.2%
Fear 0.2%
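
Per-face age, gender, and emotion estimates like the three blocks above are what AWS Rekognition's detect_faces returns. A minimal sketch, with the file name assumed:

import boto3

client = boto3.client("rekognition")

# Hypothetical local scan of the photograph.
with open("suffolk_harness_co.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort by confidence to match the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")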

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
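
The ratings above match the shape of Google Cloud Vision's face detection, which reports likelihood enums (Very unlikely through Very likely) rather than percentages. A minimal sketch, with the file name assumed:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local scan of the photograph.
with open("suffolk_harness_co.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face carries one likelihood enum per attribute, as listed above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)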

Feature analysis

Amazon

Person 99.6%
Poster 71.9%

Text analysis

Amazon

HARNESS
SORIES
CO.
SUFFOLK HARNESS CO.
AUTO
ACCES SORIES
SUFFOLK
SHOP
ACCES
FAIR
25
Whiz
21 FAIR
21
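
Word-level OCR results like the list above (including split tokens such as "ACCES" and "SORIES") are what AWS Rekognition's detect_text returns. A minimal sketch, with the file name assumed:

import boto3

client = boto3.client("rekognition")

# Hypothetical local scan of the photograph.
with open("suffolk_harness_co.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# WORD detections give individual tokens such as "SUFFOLK" and "HARNESS";
# LINE detections give joined strings such as "SUFFOLK HARNESS CO.".
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])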

Google

SUFFOLK HARNESS CO. AUTO HARNESS SHOP ACCES SORIES Whiz
SUFFOLK
HARNESS
CO.
AUTO
SHOP
ACCES
SORIES
Whiz
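
Google Cloud Vision's OCR returns one full-text annotation followed by word-level annotations, which matches the joined-line-then-words structure above. A minimal sketch, with the file name assumed:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local scan of the photograph.
with open("suffolk_harness_co.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
annotations = response.text_annotations

if annotations:
    print(annotations[0].description)  # full detected text, newline-joined
    for word in annotations[1:]:       # individual words follow
        print(word.description)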