Human Generated Data

Title

Untitled (women's group working around a table, Quota Club)

Date

1939

People

Artist: Harris & Ewing, American, 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22280

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 98.9
Human 98.9
Person 98.5
Person 98.3
Person 96.4
Person 96.1
Person 95.9
Person 90.7
Room 90.1
Indoors 90.1
Workshop 88.9
Clothing 83.6
Apparel 83.6
Person 82.9
Classroom 79.7
School 79.7
Person 79.3
Person 77.1
Table 72.8
Furniture 72.8
People 69.7
Crowd 68
Text 62.8
Housing 60.4
Building 60.4
Symbol 60.3
Lab 58.4
Desk 58.1
Audience 55.6
Screen 55.5
Electronics 55.5
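
These labels are consistent with output from Amazon Rekognition's DetectLabels operation. Below is a minimal sketch of producing a comparable tag list with boto3, assuming AWS credentials are configured; the file name and confidence threshold are placeholders, not values from this record.

```python
import boto3

# Sketch only: assumes AWS credentials are configured in the environment
# and that the photograph is available locally as "photo.jpg".
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

for label in response["Labels"]:
    # Each label carries a name and a 0-100 confidence score,
    # matching the "Name score" rows above.
    print(label["Name"], round(label["Confidence"], 1))
```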

Clarifai
created on 2023-10-22

people 99.8
group 99.5
group together 97.7
man 96.5
adult 95.4
administration 94.5
woman 94.4
many 94.3
furniture 94.2
child 93.9
leader 91.2
military 90.8
war 88.9
sit 88.8
education 87.8
room 86.4
monochrome 85
indoors 83.8
sitting 83.2
chair 80.7
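
A list like this can be requested from Clarifai's prediction API. The sketch below calls the public general recognition model over REST; the endpoint path, model id, access token, and image URL are assumptions drawn from Clarifai's public documentation, not from this record.

```python
import requests

# Sketch only: CLARIFAI_PAT and the image URL are placeholders.
URL = ("https://api.clarifai.com/v2/users/clarifai/apps/main/"
       "models/general-image-recognition/outputs")
payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}

resp = requests.post(URL, headers={"Authorization": "Key CLARIFAI_PAT"}, json=payload)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are 0-1 probabilities; scale to match the rows above.
    print(concept["name"], round(concept["value"] * 100, 1))
```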

Imagga
created on 2022-03-11

shop 21.1
business 20
table 19.8
room 19.1
people 17.8
design 16.3
work 15.8
person 15.5
mercantile establishment 14.8
interior 14.1
modern 13.3
art 13.2
glass 12.7
technology 12.6
office 12.3
party 12
decoration 11.8
team 11.6
patriotic 11.5
nation 11.4
adult 11.3
commerce 11.2
men 11.2
national 10.9
professional 10.8
patriotism 10.6
male 10.6
working 10.6
state 10.5
group 10.5
place of business 10.5
fabric 10.5
restaurant 10.4
celebration 10.4
flag 10.2
businesswoman 10
rippling 9.8
man 9.7
waving 9.7
businessman 9.7
equipment 9.7
country 9.7
wavy 9.6
world 9.6
home 9.6
ripple 9.5
graphic 9.5
symbol 9.4
meeting 9.4
clip 9.3
window 9.3
worker 9
decor 8.8
medical 8.8
indoors 8.8
wide 8.6
hand 8.5
house 8.4
health 8.3
wedding 8.3
sign 8.3
shoe shop 8.1
furniture 8
medicine 7.9
users 7.9
homepage 7.9
domain 7.9
dinner 7.8
napkin 7.8
e commerce 7.8
scientific 7.7
chemistry 7.7
search 7.7
chemical 7.7
luxury 7.7
laboratory 7.7
navigation 7.7
case 7.7
research 7.6
biology 7.6
elegance 7.6
website 7.5
site 7.5
manager 7.4
instrument 7.4
black 7.2
groom 7.2
suit 7.2
job 7.1
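
Imagga exposes tagging through an authenticated REST endpoint. A hedged sketch of fetching tags in this form; the API key, secret, and image URL are placeholders.

```python
import requests

# Sketch only: credentials and image URL are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Each entry pairs an English tag name with a 0-100 confidence.
    print(item["tag"]["en"], round(item["confidence"], 1))
```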

Google
created on 2022-03-11

White 92.2
Black 89.7
Black-and-white 87.8
Style 84.2
Art 80.7
Font 79.9
Monochrome 78.9
Monochrome photography 77.5
Snapshot 74.3
Event 74.2
T-shirt 69.8
Design 68.3
Room 68.1
Automotive design 65.5
Fun 64.9
Stock photography 64.2
Visual arts 63.5
Crowd 58.4
Photographic paper 57.7
Eyewear 56.7
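
These entries match the label detection feature of the Google Cloud Vision API. A minimal sketch with the google-cloud-vision client, assuming Google Cloud credentials are configured; the file name is a placeholder.

```python
from google.cloud import vision

# Sketch only: assumes GOOGLE_APPLICATION_CREDENTIALS points at a key file.
client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Scores are 0-1; scale to match the rows above.
    print(label.description, round(label.score * 100, 1))
```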

Microsoft
created on 2022-03-11

person 97.2
man 94.5
text 90.6
indoor 86
clothing 78.1
black and white 69.3
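
Tags in this form can come from Azure's Computer Vision service. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders, and attributing this record's tags to that specific service is an assumption.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Sketch only: endpoint and key are placeholders for a real Azure resource.
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("AZURE_KEY"),
)

result = client.tag_image("https://example.com/photo.jpg")
for tag in result.tags:
    # Confidences are 0-1; scale to match the rows above.
    print(tag.name, round(tag.confidence * 100, 1))
```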

Face analysis

AWS Rekognition

Age 48-54
Gender Female, 96.9%
Calm 99.8%
Sad 0.1%
Surprised 0%
Angry 0%
Confused 0%
Disgusted 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 48-54
Gender Male, 61.4%
Calm 99.1%
Sad 0.5%
Surprised 0.1%
Disgusted 0.1%
Happy 0.1%
Confused 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 47-53
Gender Female, 97.8%
Calm 93.7%
Sad 4.2%
Happy 1.1%
Surprised 0.4%
Confused 0.3%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 54-62
Gender Male, 99.7%
Calm 94.7%
Sad 4.8%
Surprised 0.3%
Confused 0.1%
Angry 0.1%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 48-54
Gender Male, 98.8%
Sad 50.9%
Calm 38.6%
Confused 7.7%
Surprised 0.8%
Fear 0.7%
Angry 0.6%
Disgusted 0.4%
Happy 0.3%

AWS Rekognition

Age 48-56
Gender Female, 88.7%
Calm 73.5%
Surprised 13.4%
Happy 7.1%
Confused 2.2%
Sad 1.4%
Disgusted 1.4%
Angry 0.6%
Fear 0.5%

AWS Rekognition

Age 48-56
Gender Female, 84.3%
Calm 99.3%
Confused 0.2%
Sad 0.2%
Happy 0.1%
Surprised 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0%
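
Per-face age ranges, gender estimates, and emotion scores like the blocks above come from Rekognition's DetectFaces operation with full attributes requested. A minimal sketch, assuming AWS credentials are configured; the file name is a placeholder.

```python
import boto3

# Sketch only: assumes AWS credentials are configured.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort descending to match the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```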

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
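
The likelihood buckets above (Very unlikely, Unlikely, Possible, and so on) match the face detection annotations returned by the Google Cloud Vision API. A minimal sketch, assuming configured credentials; the file name is a placeholder.

```python
from google.cloud import vision


def bucket(likelihood):
    # Enum names like VERY_UNLIKELY are reformatted to "Very unlikely".
    return likelihood.name.replace("_", " ").capitalize()


# Sketch only: assumes GOOGLE_APPLICATION_CREDENTIALS points at a key file.
client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", bucket(face.surprise_likelihood))
    print("Anger", bucket(face.anger_likelihood))
    print("Sorrow", bucket(face.sorrow_likelihood))
    print("Joy", bucket(face.joy_likelihood))
    print("Headwear", bucket(face.headwear_likelihood))
    print("Blurred", bucket(face.blurred_likelihood))
```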

Feature analysis

Amazon

Person
Person 98.9%
Person 98.5%
Person 98.3%
Person 96.4%
Person 96.1%
Person 95.9%
Person 90.7%
Person 82.9%
Person 79.3%
Person 77.1%

Categories

Imagga

interior objects 99.6%

Text analysis

Amazon

TOUO
820le
BOJO TOUO 820le N
N
BOJO
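
The fragments above are raw OCR output, plausibly from Rekognition's DetectText operation reading signage or papers visible in the scene. A minimal sketch of listing detected lines, assuming AWS credentials are configured; the file name is a placeholder.

```python
import boto3

# Sketch only: assumes AWS credentials are configured.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # LINE entries aggregate WORD entries; print whole lines as listed above.
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```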