Human Generated Data

Title

Untitled (swimsuit competition)

Date

1930

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1869

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Shorts 99.9
Clothing 99.9
Apparel 99.9
Person 99.8
Human 99.8
Person 99.7
Person 99.7
Person 99.7
Person 99.5
Person 99.5
Person 99.4
Person 99.2
Person 99.2
Person 98.9
Person 98.8
Person 98.7
Person 98.5
Person 98.3
Person 98.2
Person 97.8
Military 95.2
Military Uniform 93.1
Sailor Suit 82.4
People 80.2
Officer 80
Army 71
Armored 71
Soldier 70
Crowd 61.4
Porch 60.8
Chair 58.6
Furniture 58.6
Kid 57.8
Child 57.8
Troop 56.6

Clarifai
created on 2023-10-25

people 99.9
group together 98.6
many 98.3
man 97.2
group 95.2
adult 92.2
leader 87.2
crowd 86.7
uniform 85.8
discipline 85
victory 84.4
woman 81.2
child 79.6
partnership 79.4
respect 77.3
boy 76.1
wear 72.8
education 69.8
military 67.8
music 67.1

Imagga
created on 2021-12-14

negative 36.3
film 30.5
blackboard 25
photographic paper 22.1
picket fence 21.9
fence 17.2
people 15.1
photographic equipment 14.7
design 14.6
business 14.6
glass 13.9
crowd 13.4
silhouette 13.2
barrier 13.2
water 12.7
man 12.4
city 11.6
art 11.4
cold 11.2
snow 10.7
liquid 10.4
winter 10.2
interior 9.7
urban 9.6
scene 9.5
men 9.4
symbol 9.4
motion 9.4
light 9.4
reflection 9.3
life 9.2
clean 9.2
ice 9.1
black 9
human 9
structure 9
team 9
technology 8.9
obstruction 8.9
pattern 8.9
group 8.9
window 8.6
health 8.3
drop 8.2
transparent 8.1
businessman 7.9
clear 7.8
architecture 7.8
male 7.8
wave 7.8
modern 7.7
texture 7.6
ripple 7.6
walking 7.6
rain 7.5
person 7.5
outdoors 7.5
adult 7.3
speed 7.3
businesswoman 7.3
wineglass 7.2
women 7.1

Google
created on 2021-12-14

Font 81.4
Vintage clothing 70.8
Monochrome 70.5
Room 68.3
Art 68.3
Crew 67.7
Visual arts 65.9
History 65.8
Uniform 61.5
Team 60.5
Monochrome photography 58.3
Suit 55.2
Collection 54.8
Illustration 53.4
Rectangle 51.5

Microsoft
created on 2021-12-14

text 99.1
window 97.8
person 91.1
old 78.8
clothing 70.9
posing 65.7
picture frame 9.9

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 67.7%
Calm 37.3%
Happy 31%
Sad 20.9%
Confused 3.2%
Surprised 2.6%
Angry 2.6%
Fear 1.5%
Disgusted 1%

AWS Rekognition

Age 27-43
Gender Female, 92.9%
Happy 60.2%
Calm 16.4%
Sad 10.7%
Fear 3.2%
Confused 3.1%
Disgusted 2.3%
Surprised 2.1%
Angry 2%

AWS Rekognition

Age 47-65
Gender Female, 85.4%
Sad 46.1%
Calm 42.7%
Happy 7%
Confused 1.5%
Angry 1.1%
Fear 0.7%
Disgusted 0.5%
Surprised 0.4%

AWS Rekognition

Age 40-58
Gender Female, 85.8%
Calm 80.8%
Happy 13.8%
Sad 3.6%
Angry 0.6%
Fear 0.5%
Confused 0.3%
Disgusted 0.2%
Surprised 0.1%

AWS Rekognition

Age 6-16
Gender Female, 64%
Sad 55.5%
Calm 39.1%
Happy 3.5%
Angry 0.8%
Disgusted 0.5%
Fear 0.3%
Confused 0.3%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.8%